
    2D parallel thinning and shrinking based on sufficient conditions for topology preservation

    Thinning and shrinking algorithms extract medial lines and topological kernels, respectively, from digital binary objects in a topology-preserving way. These topological algorithms are composed of reduction operations: object points that satisfy certain topological and geometrical constraints are removed until stability is reached. In this work we present new sufficient conditions for topology-preserving parallel reductions, together with fifty-four new 2D parallel thinning and shrinking algorithms based on these conditions. The proposed thinning algorithms use five characterizations of endpoints.
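
    To make the reduction-based scheme concrete, here is a minimal sketch of one well-known 2D parallel thinning method, the classical Zhang-Suen two-subiteration algorithm. It is not the paper's new sufficient conditions or endpoint characterizations, only a standard baseline illustrating how parallel reductions repeatedly delete deletable border points until stability.

```python
# Classical Zhang-Suen parallel thinning (a standard baseline, not the
# paper's new conditions). Input: a 2D numpy array of 0s and 1s.
import numpy as np

def _neighbors(img, y, x):
    # P2..P9, clockwise, starting from the pixel above (y-1, x).
    return [img[y-1, x], img[y-1, x+1], img[y, x+1], img[y+1, x+1],
            img[y+1, x], img[y+1, x-1], img[y, x-1], img[y-1, x-1]]

def zhang_suen(img):
    img = np.pad(img.astype(np.uint8), 1)   # zero border, no bounds checks
    changed = True
    while changed:                           # iterate until stability
        changed = False
        for step in (0, 1):                  # two subiterations per pass
            to_delete = []
            for y in range(1, img.shape[0] - 1):
                for x in range(1, img.shape[1] - 1):
                    if img[y, x] == 0:
                        continue
                    p = _neighbors(img, y, x)
                    b = sum(p)               # number of object neighbors
                    a = sum(p[i] == 0 and p[(i + 1) % 8] == 1
                            for i in range(8))   # 0->1 transitions around P1
                    if step == 0:
                        cond = p[0]*p[2]*p[4] == 0 and p[2]*p[4]*p[6] == 0
                    else:
                        cond = p[0]*p[2]*p[6] == 0 and p[0]*p[4]*p[6] == 0
                    if 2 <= b <= 6 and a == 1 and cond:
                        to_delete.append((y, x))  # mark in parallel
            for y, x in to_delete:                # then delete simultaneously
                img[y, x] = 0
            changed = changed or bool(to_delete)
    return img[1:-1, 1:-1]
```

    Marking all deletable points first and removing them together is what makes the reduction parallel; the topological conditions (the transition count and neighbor products) are what keep the deletion topology-preserving.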

    Acta Cybernetica: Volume 20, Number 1.


    Empirical Analysis of Electron Beam Lithography Optimization Models from a Pragmatic Perspective

    Electron Beam (EB) lithography is a process that focuses electron beams on silicon wafers to pattern integrated circuits (ICs). It uses an electron gun, a blanking electrode, multiple electron lenses, a deflection electrode, and control circuits for each of these components. However, the lithography process causes critical dimension overshoots, which reduce the quality of the underlying ICs; these overshoots result from increased beam currents, frequent electron flashes, and reduced re-exposure of chip areas. To overcome these issues, researchers have proposed a wide variety of optimization models, which vary in both qualitative and quantitative performance. These models also differ in their internal operating characteristics, which makes it ambiguous which model is optimal for a given application-specific use case. To reduce this ambiguity, this text discusses application-specific nuances, functional advantages, deployment-specific limitations, and contextual future research scopes. Based on this discussion, it was observed that bioinspired models outperform linear modelling techniques, which makes them highly useful for real-time deployments. These models stochastically evaluate optimum electron beam configurations, which improves wafer quality and imprinting speed compared with other models. To further facilitate model selection, this text compares the models in terms of accuracy, throughput, critical dimensions, deployment cost, and computational complexity. Based on this comparison, researchers will be able to identify optimum models for their performance-specific use cases. This text also proposes a novel EB Lithography Optimization Metric (EBLOM), which combines multiple performance parameters to estimate true model performance under real-time scenarios. Based on this metric, researchers will be able to identify models that perform optimally under performance-specific constraints.
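
    The abstract does not specify how EBLOM aggregates its inputs, so the following is a purely hypothetical sketch: a weighted geometric mean of normalized accuracy, throughput, critical-dimension fidelity, cost, and complexity scores. The field names, weights, and aggregation rule are all illustrative assumptions, not the proposed metric.

```python
# Hypothetical composite metric in the spirit of EBLOM. The paper's actual
# formula is not given in the abstract; weights and aggregation here are
# illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class ModelScores:
    accuracy: float      # higher is better, normalized to [0, 1]
    throughput: float    # higher is better, normalized to [0, 1]
    cd_fidelity: float   # critical-dimension fidelity, higher is better
    cost: float          # lower is better, normalized to [0, 1]
    complexity: float    # lower is better, normalized to [0, 1]

def eblom(s: ModelScores, w=(0.3, 0.2, 0.3, 0.1, 0.1)) -> float:
    # Invert the "lower is better" terms so every factor rewards quality,
    # then take a weighted geometric mean, which penalizes any single
    # weak metric more strongly than an arithmetic mean would.
    factors = (s.accuracy, s.throughput, s.cd_fidelity,
               1.0 - s.cost, 1.0 - s.complexity)
    score = 1.0
    for f, wi in zip(factors, w):
        score *= max(f, 1e-9) ** wi
    return score

print(eblom(ModelScores(0.92, 0.70, 0.88, 0.30, 0.45)))
```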

    Flexible constrained sampling with guarantees for pattern mining

    Pattern sampling has been proposed as a potential solution to the infamous pattern explosion. Instead of enumerating all patterns that satisfy the constraints, individual patterns are sampled proportional to a given quality measure. Several sampling algorithms have been proposed, but each of them has its limitations when it comes to 1) flexibility in terms of quality measures and constraints that can be used, and/or 2) guarantees with respect to sampling accuracy. We therefore present Flexics, the first flexible pattern sampler that supports a broad class of quality measures and constraints, while providing strong guarantees regarding sampling accuracy. To achieve this, we leverage the perspective on pattern mining as a constraint satisfaction problem and build upon the latest advances in sampling solutions in SAT as well as existing pattern mining algorithms. Furthermore, the proposed algorithm is applicable to a variety of pattern languages, which allows us to introduce and tackle the novel task of sampling sets of patterns. We introduce and empirically evaluate two variants of Flexics: 1) a generic variant that addresses the well-known itemset sampling task and the novel pattern set sampling task as well as a wide range of expressive constraints within these tasks, and 2) a specialized variant that exploits existing frequent itemset techniques to achieve substantial speed-ups. Experiments show that Flexics is both accurate and efficient, making it a useful tool for pattern-based data exploration. (Accepted for publication in the Data Mining and Knowledge Discovery journal, ECML/PKDD 2017 journal track.)
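
    As a toy illustration of the sampling task itself (not of Flexics, which relies on hashing-based sampling over a constraint-satisfaction encoding), the sketch below draws itemsets with probability proportional to their support, subject to a minimum-support constraint, via naive rejection sampling. The dataset and parameters are made up for the example.

```python
# Naive rejection sampler for the itemset-sampling task: accept a uniformly
# proposed itemset with probability support/|D|, so accepted samples are
# distributed proportionally to support among constraint-satisfying itemsets.
import random

def support(itemset, transactions):
    # Number of transactions that contain the itemset.
    return sum(itemset <= t for t in transactions)

def sample_itemset(transactions, items, min_support, max_tries=100_000):
    n = len(transactions)
    for _ in range(max_tries):
        # Propose a uniformly random non-empty itemset.
        k = random.randint(1, len(items))
        cand = frozenset(random.sample(items, k))
        s = support(cand, transactions)
        if s < min_support:
            continue                     # constraint violated: reject
        if random.random() < s / n:      # accept proportionally to support
            return cand, s
    raise RuntimeError("no sample accepted; constraints may be too tight")

db = [frozenset("abc"), frozenset("ab"), frozenset("ac"), frozenset("b")]
print(sample_itemset(db, list("abc"), min_support=2))
```

    Rejection sampling like this becomes hopelessly slow when constraint-satisfying patterns are rare, which is precisely the inefficiency that hashing-based samplers with accuracy guarantees are designed to avoid.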

    Parameterized Algorithmics for Computational Social Choice: Nine Research Challenges

    Computational Social Choice is an interdisciplinary research area involving Economics, Political Science, and Social Science on the one side, and Mathematics and Computer Science (including Artificial Intelligence and Multiagent Systems) on the other side. Typical computational problems studied in this field include the vulnerability of voting procedures to attacks, and preference aggregation in multi-agent systems. Parameterized Algorithmics is a subfield of Theoretical Computer Science that seeks to exploit meaningful problem-specific parameters in order to identify tractable special cases of problems that are computationally hard in general. In this paper, we propose nine of our favorite research challenges concerning the parameterized complexity of problems appearing in this context.
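
    As a generic illustration of the parameterized-algorithmics idea (not a problem from the paper), the sketch below solves Vertex Cover by bounded-depth branching: the running time is exponential only in the parameter k, so large instances with small k remain tractable.

```python
# Vertex Cover by bounded branching, the textbook FPT example. Runtime is
# O(2^k * (n + m)): exponential only in the parameter k, not the input size.
def has_vertex_cover(edges, k):
    if not edges:
        return True                  # nothing left to cover
    if k == 0:
        return False                 # budget exhausted but edges remain
    u, v = edges[0]
    # Any cover must contain u or v: branch on both choices,
    # removing the edges the chosen vertex already covers.
    return (has_vertex_cover([e for e in edges if u not in e], k - 1) or
            has_vertex_cover([e for e in edges if v not in e], k - 1))

assert has_vertex_cover([(1, 2), (2, 3), (3, 4)], 2)        # cover {2, 3}
assert not has_vertex_cover([(1, 2), (3, 4), (5, 6)], 2)    # needs k = 3
```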