    A Robust F-measure for evaluating discovered process models

    Evaluating Leniency with Missing Information on Undetected Cartels: Exploring Time-Varying Policy Impacts on Cartel Duration

    This paper examines the effects of the European Commission’s (EC) new leniency program on the EC’s ability to detect and deter cartels. As a supplementary analysis, the US leniency program is also studied. I discuss a dynamic model of cartel formation and dissolution to illustrate how changes in antitrust policies and economic conditions might affect cartel duration. The comparative statics results are then corroborated with empirical estimates of hazard functions adjusted to account for both the heterogeneity of cartels and the time-varying policy impacts suggested by theory. Contrary to earlier studies, my statistical tests are consistent with the theoretical prediction that, following an efficacious leniency program, the average duration of discovered cartels rises in the short run and falls in the long run. The results shed light on the design of enforcement programs against cartels and other forms of conspiracy.
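
    As a rough illustration of the kind of duration analysis described above (not the paper's actual specification), the sketch below fits a hazard model with a time-varying leniency-policy indicator on invented cartel episode data, using the lifelines library; all column names and numbers are hypothetical.

    # A minimal sketch, assuming hypothetical long-format episode data: one row per
    # cartel per interval, with `post_leniency` switching to 1 once the new leniency
    # program applies to that cartel and `dissolved` marking the end of the cartel.
    import pandas as pd
    from lifelines import CoxTimeVaryingFitter

    episodes = pd.DataFrame({
        "cartel_id":     [1, 1, 2, 2, 3, 4, 4, 5, 5],
        "start":         [0, 10, 0, 30, 0, 0, 5, 0, 30],
        "stop":          [10, 25, 30, 72, 20, 5, 40, 30, 60],
        "post_leniency": [0, 1, 0, 1, 0, 0, 1, 0, 1],
        "dissolved":     [0, 1, 0, 0, 1, 0, 1, 0, 1],
    })

    # Estimate how the hazard of cartel break-up shifts once the policy is in force.
    ctv = CoxTimeVaryingFitter()
    ctv.fit(episodes, id_col="cartel_id", start_col="start",
            stop_col="stop", event_col="dissolved")
    ctv.print_summary()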

    Evaluating epistemic uncertainty under incomplete assessments

    This study proposes an extended methodology for laboratory-based Information Retrieval evaluation under incomplete relevance assessments. The methodology aims to identify potential uncertainty during system comparison that may result from incompleteness. Its adoption is advantageous because detecting epistemic uncertainty - the amount of knowledge (or ignorance) we have about the estimate of a system's performance - during the evaluation process can guide researchers when evaluating new systems over existing and future test collections. Across a series of experiments we demonstrate how this methodology leads to a finer-grained analysis of systems. In particular, we show through experimentation how the current practice in Information Retrieval evaluation of using a measurement depth larger than the pooling depth increases uncertainty during system comparison.
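
    The pooling-depth issue mentioned at the end of the abstract can be made concrete with a small sketch (my illustration, not the thesis's methodology): when the measurement depth exceeds the pooling depth, documents below the pool are unjudged, so a score such as P@k is only known to lie inside an interval. All names and data below are hypothetical.

    def precision_at_k_bounds(ranking, judged_relevant, judged_nonrelevant, k):
        """Return (lower, upper) bounds on P@k given incomplete relevance judgments."""
        rel_lower = 0  # treat every unjudged document as non-relevant
        rel_upper = 0  # treat every unjudged document as relevant
        for doc in ranking[:k]:
            if doc in judged_relevant:
                rel_lower += 1
                rel_upper += 1
            elif doc not in judged_nonrelevant:  # unjudged: outcome unknown
                rel_upper += 1
        return rel_lower / k, rel_upper / k

    ranking = ["d3", "d7", "d1", "d9", "d4", "d8", "d2", "d5", "d6", "d0"]
    judged_relevant = {"d3", "d1"}           # judgments come from a pool of depth 5
    judged_nonrelevant = {"d7", "d9", "d4"}

    # Measuring at depth 10 (> pooling depth 5) leaves five documents unjudged,
    # so P@10 is only bounded: this prints (0.2, 0.7).
    print(precision_at_k_bounds(ranking, judged_relevant, judged_nonrelevant, 10))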

    A new sequential covering strategy for inducing classification rules with ant colony algorithms

    Ant colony optimization (ACO) algorithms have been successfully applied to discover lists of classification rules. In general, these algorithms follow a sequential covering strategy, where a single rule is discovered at each iteration in order to build a list of rules. The sequential covering strategy has the drawback of not coping with the problem of rule interaction, i.e., the outcome of a rule affects the rules that can be discovered subsequently, since the search space is modified by the removal of the examples covered by previous rules. This paper proposes a new sequential covering strategy for ACO classification algorithms that mitigates the problem of rule interaction: the order of the rules is implicitly encoded as pheromone values and the search is guided by the quality of a candidate list of rules. Our experiments using 18 publicly available data sets show that the predictive accuracy obtained by a new ACO classification algorithm implementing the proposed sequential covering strategy is statistically significantly higher than that of state-of-the-art rule induction classification algorithms.
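
    To make the rule-interaction problem concrete, the sketch below shows the conventional sequential covering loop the paper contrasts against: each learned rule removes the examples it covers, so earlier rules reshape the search space of later ones. The pheromone-guided search over complete rule lists proposed in the paper is not reproduced here, and the greedy learn_one_rule below is only a hypothetical stand-in for an ACO rule constructor.

    from collections import Counter

    def learn_one_rule(examples):
        """Greedily pick the single (attribute, value) test covering the most examples."""
        counts = Counter((a, v) for features, _ in examples for a, v in features.items())
        (attr, val), _ = counts.most_common(1)[0]
        covered_labels = [label for features, label in examples if features.get(attr) == val]
        predicted = Counter(covered_labels).most_common(1)[0][0]  # majority class
        return {"attr": attr, "val": val, "class": predicted}

    def sequential_covering(examples, min_covered=1):
        """Build an ordered rule list by repeatedly covering and removing examples."""
        rule_list, remaining = [], list(examples)
        while remaining:
            rule = learn_one_rule(remaining)
            covered = [e for e in remaining if e[0].get(rule["attr"]) == rule["val"]]
            if len(covered) < min_covered:
                break
            rule_list.append(rule)
            # Removing covered examples changes what later rules can still discover.
            remaining = [e for e in remaining if e not in covered]
        return rule_list

    # Toy data: (feature dictionary, class label).
    data = [
        ({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "sunny", "windy": "yes"}, "stay"),
        ({"outlook": "rain", "windy": "no"}, "play"),
        ({"outlook": "rain", "windy": "yes"}, "stay"),
    ]
    print(sequential_covering(data))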

    A recommender system for process discovery

    Over the last decade, several algorithms for process discovery and process conformance have been proposed. Still, it is well accepted that there is no dominant algorithm in either of these two disciplines, and it is often difficult to apply them successfully. Most of these algorithms require close-to-expert knowledge to be applied satisfactorily. In this paper, we present a recommender system that uses portfolio-based algorithm selection strategies to address two problems: finding the best discovery algorithm for the data at hand, and bridging the gap between general users and process mining algorithms. Experiments performed with the developed tool demonstrate the usefulness of the approach on a variety of instances.
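
    A portfolio-based recommender of this kind can be sketched as a meta-classifier that maps features of an event log to the discovery algorithm that performed best on similar logs. This is only an illustrative sketch, not the paper's tool; the feature names, algorithm labels, and data below are hypothetical.

    from sklearn.ensemble import RandomForestClassifier

    # Training data: per-log features and the discovery algorithm that scored best on that log.
    log_features = [
        [12, 340, 0.15],    # [distinct activities, number of traces, noise ratio]
        [45, 1200, 0.40],
        [8, 150, 0.05],
        [30, 800, 0.25],
    ]
    best_algorithm = ["inductive_miner", "heuristics_miner", "alpha_miner", "inductive_miner"]

    recommender = RandomForestClassifier(n_estimators=100, random_state=0)
    recommender.fit(log_features, best_algorithm)

    # Recommend a discovery algorithm for a new, unseen event log.
    new_log = [[20, 500, 0.22]]
    print(recommender.predict(new_log))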