
    Input variable selection in time-critical knowledge integration applications: A review, analysis, and recommendation paper

    This is the post-print version of the final paper published in Advanced Engineering Informatics; changes resulting from the publishing process (peer review, editing, corrections, structural formatting, and other quality control mechanisms) may not be reflected in this document. Copyright © 2013 Elsevier B.V. The purpose of this research is twofold: first, to undertake a thorough appraisal of existing Input Variable Selection (IVS) methods within the context of time-critical and computation-resource-limited dimensionality reduction problems; second, to demonstrate improvements to, and the application of, a recently proposed time-critical sensitivity analysis method called EventTracker in an environmental science industrial use case, i.e., sub-surface drilling. Producing time-critical, accurate knowledge about the state of a system (effect) under computational and data acquisition (cause) constraints is a major challenge, especially when that knowledge is critical to system operation and the safety of operators or the integrity of costly equipment is at stake. Understanding and interpreting a chain of interrelated events, predicted or unpredicted, that may or may not result in a specific state of the system is the core challenge of this research. The main objective is therefore to identify which set of input data signals has a significant impact on the set of system state information (i.e., output). Through cause-effect analysis, the proposed technique supports the filtering of unsolicited data that can otherwise clog up the communication and computational capabilities of a standard supervisory control and data acquisition (SCADA) system. The paper analyzes the performance of input variable selection techniques from a series of perspectives.
It then expands the categorization and assessment of sensitivity analysis methods in a structured framework that takes into account the relationship between inputs and outputs, the nature of their time series, and the computational effort required. The outcome of this analysis is that established methods have limited suitability for time-critical variable selection applications. By way of a geological drilling monitoring scenario, the suitability of the proposed EventTracker sensitivity analysis method for high-volume and time-critical input variable selection problems is demonstrated.
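The EventTracker method itself is not reproduced in the abstract, but the underlying idea of event-coincidence sensitivity scoring can be sketched as follows. This is a minimal illustration, assuming simple change-threshold event detection and a Jaccard-style overlap score; the function and signal names are hypothetical, not the paper's API.

```python
import numpy as np

def detect_events(series, threshold):
    """Flag time slots whose slot-to-slot change exceeds a threshold."""
    return np.abs(np.diff(series)) > threshold

def sensitivity_index(input_series, output_series, threshold=0.02):
    """Overlap (Jaccard index) between input and output event sets.

    A score near 1 suggests the input signal is worth keeping;
    a score near 0 marks it as a candidate for filtering.
    """
    ei = detect_events(input_series, threshold)
    eo = detect_events(output_series, threshold)
    union = (ei | eo).sum()
    return float((ei & eo).sum() / union) if union else 0.0

# Hypothetical drilling signals: one tracks the output, one is flat.
t = np.linspace(0, 10, 200)
output = np.sin(t)
inputs = {"correlated": np.sin(t), "flat": np.zeros_like(t)}
scores = {name: sensitivity_index(s, output) for name, s in inputs.items()}
# Only the "correlated" input needs to be retained by the
# data-acquisition system; the "flat" input can be filtered out.
```

Because scoring only needs event flags per time slot, such an index can be updated incrementally, which is what makes the event-based approach attractive under time-critical constraints.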

    Assessing the effectiveness of goal-oriented modeling languages: A family of experiments

    Context: Several goal-oriented languages focus on modeling stakeholders' objectives, interests, or wishes. However, these languages can be used for various purposes (e.g., exploring system solutions or evaluating alternatives), and there are few guidelines on how to carry these models downstream to software requirements and design artifacts. Moreover, little attention has been paid to the empirical evaluation of this kind of language. In previous work, we proposed value@GRL as a specialization of the Goal-oriented Requirement Language (GRL) for specifying stakeholders' goals when dealing with early requirements in the context of incremental software development. Objective: This paper compares the value@GRL language with the i* language with respect to the quality of goal models, the participants' modeling time and productivity when creating the models, and their perceptions regarding ease of use and usefulness. Method: A family of experiments was carried out with 184 students and practitioners in which the participants were asked to specify a goal model using each of the languages. The participants also filled in a questionnaire that allowed us to assess their perceptions. Results: The results of the individual experiments and the meta-analysis indicate that the quality of goal models obtained with value@GRL is higher than that of i*, but that the participants required less time to create the goal models when using i*. The results also show that the participants perceived value@GRL to be easier to use and more useful than i* in at least two experiments of the family. Conclusions: value@GRL makes it possible to obtain goal models of good quality when compared to i*, which is one of the most frequently used goal-oriented modeling languages. It can, therefore, be considered a promising emerging approach in this area.
Several insights emerged from the study, and opportunities for improving both languages are outlined. This work was supported by the Spanish Ministry of Science, Innovation and Universities (Adapt@Cloud project, grant number TIN2017-84550-R) and the Programa de Ayudas de Investigación y Desarrollo (PAID-01-17) from the Universitat Politècnica de València. Abrahão Gonzales, S.M.; Insfran, E.; González-Ladrón-De-Guevara, F.; Fernández-Diego, M.; Cano-Genoves, C.; Pereira De Oliveira, R. (2019). Assessing the effectiveness of goal-oriented modeling languages: A family of experiments. Information and Software Technology. 116:1-24. https://doi.org/10.1016/j.infsof.2019.08.003

    Reasoning about uncertainty in empirical results

    Conclusions that are drawn from experiments are subject to varying degrees of uncertainty. For example, they might rely on small data sets, employ statistical techniques that make assumptions that are hard to verify, or be affected by unknown confounding factors. In this paper we propose an alternative but complementary mechanism to explicitly incorporate these various sources of uncertainty into reasoning about empirical findings, by applying Subjective Logic. To do this we show how typical traditional results can be encoded as "subjective opinions" -- the building blocks of Subjective Logic. We demonstrate the value of the approach by using Subjective Logic to aggregate empirical results from two large published studies that explore the relationship between programming languages and defects or failures.
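The abstract's mapping from conventional results to "subjective opinions" can be illustrated with a small sketch. It assumes Jøsang's standard binomial-opinion construction (non-informative prior weight W = 2) and the cumulative fusion operator; it is not the paper's own code, and the evidence counts are invented.

```python
from dataclasses import dataclass

W = 2.0  # non-informative prior weight, standard in Subjective Logic

@dataclass
class Opinion:
    belief: float
    disbelief: float
    uncertainty: float  # belief + disbelief + uncertainty == 1

def from_evidence(positive, negative):
    """Map r supporting / s contradicting observations to a binomial opinion."""
    total = positive + negative + W
    return Opinion(positive / total, negative / total, W / total)

def cumulative_fuse(a, b):
    """Jøsang's cumulative fusion of two independent opinions."""
    denom = a.uncertainty + b.uncertainty - a.uncertainty * b.uncertainty
    return Opinion(
        (a.belief * b.uncertainty + b.belief * a.uncertainty) / denom,
        (a.disbelief * b.uncertainty + b.disbelief * a.uncertainty) / denom,
        (a.uncertainty * b.uncertainty) / denom,
    )

# Two hypothetical studies with mixed evidence about the same claim.
study_a = from_evidence(8, 2)
study_b = from_evidence(4, 6)
combined = cumulative_fuse(study_a, study_b)
pooled = from_evidence(12, 8)  # fusing opinions matches pooling raw evidence
```

A useful sanity check on the construction: cumulatively fusing two evidence-derived opinions yields exactly the opinion obtained from pooling the raw evidence counts.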

    A Value-Driven Framework for Software Architecture

    Software that is not aligned with the business values of the organization for which it was developed does not entirely fulfill its raison d'être. Business values represent what is important in a company, or organization, and should influence the overall software system behavior, contributing to the overall success of the organization. However, approaches to derive a software architecture considering the business values exchanged between an organization and its market players are lacking. Our quest is to address this problem and investigate how to derive value-centered architectural models systematically. We used the Technology Research method to address this PhD research question. This methodological approach proposes three steps: problem analysis, innovation, and validation. The problem analysis was performed using systematic studies of the literature to obtain full coverage of the main themes of this work, particularly business value modeling, software architecture methods, and software architecture derivation methods. Next, the innovation step was accomplished by creating a framework for the derivation of a software reference architecture model considering an organization's business values. The resulting framework is composed of three core modules: Business Value Modeling, Agile Reference Architecture Modeling, and Goal-Driven SOA Architecture Modeling. While the Business Value Modeling module focuses on building a stakeholder-centric business specification, the Agile Reference Architecture Modeling and the Goal-Driven SOA Architecture Modeling modules concentrate on generating a software reference architecture aligned with the business value specification. Finally, the validation part of our framework is achieved through proof-of-concept prototypes for three new domain-specific languages, case studies, and quasi-experiments, including a family of controlled experiments.
The findings from our research show that the complexity and lack of rigor in the existing approaches to representing business values can be addressed by an early requirements specification method that represents the value exchanges of a business. Also, by using sophisticated model-driven engineering techniques (e.g., metamodels, model transformations, and model transformation languages), it was possible to obtain source generators that derive a software architecture model from early requirements value models, while assuring traceability throughout the architectural derivation process. In conclusion, despite using sophisticated techniques, the derivation of a software reference architecture is facilitated by simple-to-use methods supported by black-box transformations and guidelines that ease the activities for less experienced software architects. The experimental validation confirmed that our framework is feasible and perceived as easy to use and useful, also indicating that the participants of the experiments intend to use it in the future.

    Preference modelling approaches based on cumulative functions using simulation with applications

    In decision-making problems under uncertainty, the Mean Variance Model (MVM) consistent with Expected Utility Theory (EUT) plays an important role in ranking preferences over alternative options. Despite its wide use, this model is appropriate only when the random variables representing the alternative options are normally distributed and the utility function to be maximized is quadratic; both properties are rarely satisfied in actual applications. In this research, a novel methodology is adopted to develop generalized models that reduce the deficiencies of the existing models in solving large-scale decision problems, along with applications to real-world disputes. More specifically, two approaches are developed for eliciting preferences over pairs of alternative options: the first is based on the Mean Variance Model (MVM), consistent with Expected Utility Theory (EUT), and the second is based on the Analytic Hierarchy Process (AHP). The main innovation in the first approach is reformulating MVM to be based on cumulative functions using simulation. Two models under this approach are introduced: the first deals with ranking preferences over pairs of lotteries/options with non-negative outcomes only, while the second, for risk modelling, is a risk-preference model concerning normalized lotteries that represent risk factors, each obtained by decomposing a lottery into its mean multiplied by a risk factor. Both approximation models, which are preference-based using the determined values for expected utility, can accommodate various distribution functions with different utility functions and can handle decision problems, especially those encountered in financial economics. The study then reformulates the second approach, AHP: a new simulation-based algorithm introduces an approximation method that restricts the level of inherent uncertainty to a certain limit.
The research further proposes an integrated preference-based AHP model, introducing a novel stepwise approximation algorithm that combines the two modified approaches, MVM and AHP: it multiplies the expected-utility value resulting from the modified MVM by the weight obtained from the AHP process to produce an aggregated weight indicator. The new integrated weight scale is an accurate and flexible tool that can be employed efficiently to solve decision-making problems in scenarios that concern financial economics. Finally, to illustrate how the integrated model can be used as a practical methodology to solve real-life selection problems, this research presents the first empirical case study on the Tender Selection Process (TSP) in the Kurdistan Region (KR) of Iraq; it is an inductive and comprehensive investigation of TSP, which has received minimal attention in the region, and is regarded as a significant contribution of this research. Applying the proposed model to this case study shows that, for the evaluation of construction tenders, the integrated approach is an appropriate model that can easily be modified to match the specific conditions of a proposed project. Using simulation, the generated data allow the creation of a feedback system that can be used to evaluate future projects, in addition to making data handling easier and the evaluation process less complex and time-consuming.
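The thesis's reformulated models are not reproduced in the abstract, but the core move — ranking alternatives by a simulation-based estimate of expected utility rather than by mean and variance alone — can be sketched. This is a hedged illustration assuming an exponential (CARA) utility and lognormal lotteries; the distributions and parameter values are invented for the example and are not taken from the thesis.

```python
import numpy as np

def expected_utility(samples, risk_aversion=0.5):
    """Monte Carlo estimate of E[u(X)] with the exponential (CARA) utility
    u(x) = 1 - exp(-a * x), which is concave, so dispersion is penalized."""
    return float(np.mean(1.0 - np.exp(-risk_aversion * samples)))

rng = np.random.default_rng(42)
n = 100_000
# Two lotteries with non-negative outcomes and the same median payoff
# but very different spread -- a case plain mean-variance ranking
# handles poorly when outcomes are far from normal.
steady = rng.lognormal(mean=1.0, sigma=0.2, size=n)
risky = rng.lognormal(mean=1.0, sigma=1.0, size=n)
eu_steady = expected_utility(steady)
eu_risky = expected_utility(risky)
# A risk-averse decision maker prefers the steadier lottery.
```

Because the ranking works directly on simulated samples of the cumulative distribution, the same code accepts any outcome distribution and any utility function, which is the flexibility the abstract claims for the reformulated models.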

    Cost-Optimal Service Selection Based on Incident Patterns - A Service Level Engineering Approach

    The assessment of services' adverse business impact is a prerequisite for the determination of cost-optimal service offers. We suggest service incident patterns as an advanced form of quality measure describing a service's characteristic incident behavior. We show that knowledge about service incident patterns and about the business impact induced by service incidents is required to determine the total business costs a service induces, and we develop a method for Cost-Optimal Service Selection.
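The abstract does not spell out the cost model, but the basic idea — comparing service offers by fee plus the expected business impact of their characteristic incidents — can be sketched as follows. All service names, incident rates, and prices here are invented for illustration, assuming a simple additive expected-impact model.

```python
def total_business_cost(fee, incident_rates, impact_costs):
    """Service fee plus expected incident-induced business impact,
    summed over the incident types in the service's incident pattern."""
    return fee + sum(incident_rates[k] * impact_costs[k] for k in incident_rates)

# Business impact per incident type (hypothetical figures).
impact = {"outage": 50.0, "degraded": 5.0}

# Two offers: a cheap service with a worse incident pattern,
# and a pricier one with fewer expected incidents.
offers = {
    "basic":   total_business_cost(100.0, {"outage": 4, "degraded": 10}, impact),
    "premium": total_business_cost(250.0, {"outage": 1, "degraded": 2}, impact),
}
best = min(offers, key=offers.get)
# The cheaper fee is outweighed by incident impact: "premium" wins.
```

The example shows why the raw service fee alone is a misleading selection criterion: only the incident pattern reveals the total cost a service actually induces.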