
    Extracting Alternative Machining Features: An Algorithmic Approach

    Automated recognition of features from CAD models has been attempted for a wide range of application domains. In this paper we address the problem of representing and recognizing the complete class of features in alternative interpretations for a given design. We present a formalism for representing feature-based design alternatives and a methodology for recognizing a class of machinable features. Our approach handles a class of volumetric features that describe material removal volumes made by operations on three-axis vertical machining centers, including drilling; pocket-, slot-, and face-milling; chamfering; filleting; and blended surfaces. Our approach recognizes intersecting features and is complete over all features in our class, i.e., for any given part, the algorithm produces a set containing all features in our class that correspond to possible operations for machining that part. This property is of particular significance in applications where consideration of different manufacturing alternatives is crucial. In addition, we have shown that the algorithms are, in the worst case, quadratic in the number of solid modeling operations. This approach employs a class of machinable features expressible as MRSEVs (a STEP-based library of machining features). An implementation of these algorithms has been done using the ACIS solid modeler and the NIH C++ class library.

    On the Nature of Modal Truth in Plans

    Chapman's paper, "Planning for Conjunctive Goals", has been widely acknowledged as a major step towards understanding the nature of nonlinear planning, and it has been one of the bases of later work by others -- but it is not free of problems. This paper discusses the following problems with modal truth and the modal truth criterion.
    1. It is NP-hard to tell, given a plan P and a ground atom p, whether p is possibly true in P's final situation. This is true despite the fact that the modal truth criterion can be computed in polynomial time.
    2. The reason for this discrepancy is that the "possible truth" version of the modal truth criterion is incorrect. It tells whether ¬p is not necessarily true --- but this is different from telling whether p is possibly true. Possible truth is not the dual of necessary truth, as Chapman had thought it was.
    3. Instead, possible truth is the dual of another problem, which is co-NP-hard: the problem of determining whether p is true over all executable completions of a plan.
    Despite the above problems, the "necessary truth" version of the modal truth criterion (and hence the TWEAK planner) are still correct.
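
    The distinction drawn in points 2 and 3 can be written out explicitly (the shorthand NEC, POSS, and ALLEXEC below is ours, not notation from the paper):

        \begin{align*}
        \text{criterion's ``possible truth'' test for } p \;&\equiv\; \neg\,\mathrm{NEC}(\neg p) && \text{(point 2)}\\
        \mathrm{POSS}(p) \;&\equiv\; \neg\,\mathrm{ALLEXEC}(\neg p) && \text{(point 3)}
        \end{align*}

    Here NEC(q) stands for "q is necessarily true in P's final situation" and ALLEXEC(q) for "q is true over all executable completions of P". The paper's point is that the two right-hand sides are not equivalent, and that deciding ALLEXEC is co-NP-hard, so possible truth is the dual of ALLEXEC rather than of NEC.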

    State-Space Search, Problem Reduction, and Iterative Deepening: A Comparative Analysis

    In previous work, Korf showed that by introducing one problem-reduction step into a state-space search, one could reduce the number of node generations from O((2b)^(2d)) to O(b^d), where b and d are the branching factor and search depth. My results are as follows:
    1. The O(b^d) bound is tight, but the O((2b)^(2d)) bound is not: the A* procedure does only Θ(b^(2d)) node generations. Thus, the improvement produced by one problem-reduction step is not always as great as the previous results might suggest.
    2. In an AND/OR tree where multiple problem-reduction steps are possible, problem reduction produces a much more dramatic improvement: both the time complexity and the space complexity decrease from doubly exponential to singly exponential.
    3. For iterative-deepening procedures like IDA* that only remember the nodes on the current path, the space complexity decreases but the time complexity increases - by exponential amounts in Korf's model, and doubly exponential amounts in the AND/OR-tree model. This is true even for IDAO*, a new procedure that improves IDA*'s performance by combining it with problem reduction.
    These results lead to the following conclusions: In general, problem reduction can save huge amounts of both time and space. Whether to use a procedure that remembers every node it has visited, or instead use a limited-memory iterative-deepening procedure, depends on whether the primary objective is to save space or to save time.
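
    Assuming the bounds read as rendered above, the gap described in point 1 comes down to a constant base in the exponent:

        \[(2b)^{2d} \;=\; 2^{2d}\, b^{2d} \;=\; 4^{d}\, b^{2d}\]

    so A*'s Θ(b^(2d)) node generations beat the O((2b)^(2d)) bound only by a factor of 4^d, while the problem-reduction bound Θ(b^d) beats Θ(b^(2d)) by a factor of b^d; this is the sense in which the improvement from a single problem-reduction step is smaller than the earlier comparison suggests.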

    Hierarchical Abstraction for Process Planning.

    In most frame-based reasoning systems, the data manipulated by the system is represented using frames, and the problem-solving knowledge used to manipulate this data consists of rules. However, rules are not always the best way to represent problem-solving knowledge.

    The Preprocessing of Search Spaces for Branch and Bound Search.

    Heuristic search procedures are useful in a large number of problems of practical importance. Such procedures operate by searching several paths in a search space at the same time, expanding some paths more quickly than others depending on which paths look most promising. Often, large amounts of time are required to keep track of this control knowledge. For some problems, this overhead can be greatly reduced by preprocessing the problem in appropriate ways. In particular, we discuss a data structure called a threaded decision graph, which can be created by preprocessing the search space for some problems, and which captures the control knowledge for problem solving. We show how this can be done, and we present an analysis showing that by using such a method, a great deal of time can be saved during the problem-solving process.
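
    The bookkeeping overhead this abstract refers to can be seen in a generic best-first branch-and-bound loop. The sketch below is our own illustration of that baseline (an open list of partial solutions ordered by lower bounds), shown on a small task-assignment example; it is not the paper's threaded decision graph, whose structure the abstract does not detail, and all names are ours.

        #include <algorithm>
        #include <climits>
        #include <iostream>
        #include <queue>
        #include <vector>

        struct Node {
            int level;              // number of tasks assigned so far
            int cost;               // total cost of the assignments made so far
            int bound;              // optimistic lower bound on any completion
            std::vector<bool> used; // workers already taken
        };

        struct ByBound {            // order the open list by most promising bound
            bool operator()(const Node& a, const Node& b) const { return a.bound > b.bound; }
        };

        // Optimistic bound: each unassigned task gets its cheapest free worker,
        // ignoring conflicts between the remaining tasks.
        int lowerBound(const std::vector<std::vector<int>>& c, const Node& n) {
            int b = n.cost;
            for (size_t t = n.level; t < c.size(); ++t) {
                int best = INT_MAX;
                for (size_t w = 0; w < c.size(); ++w)
                    if (!n.used[w]) best = std::min(best, c[t][w]);
                b += best;
            }
            return b;
        }

        int branchAndBound(const std::vector<std::vector<int>>& c) {
            int n = (int)c.size();
            int bestCost = INT_MAX;
            // All of this per-node bookkeeping is the kind of control overhead
            // that preprocessing the search space aims to reduce.
            std::priority_queue<Node, std::vector<Node>, ByBound> open;
            Node root{0, 0, 0, std::vector<bool>(n, false)};
            root.bound = lowerBound(c, root);
            open.push(root);
            while (!open.empty()) {
                Node cur = open.top();
                open.pop();
                if (cur.bound >= bestCost) continue;            // prune: cannot improve
                if (cur.level == n) { bestCost = cur.cost; continue; }
                for (int w = 0; w < n; ++w) {                   // branch on the next task
                    if (cur.used[w]) continue;
                    Node child = cur;
                    child.level = cur.level + 1;
                    child.cost = cur.cost + c[cur.level][w];
                    child.used[w] = true;
                    child.bound = lowerBound(c, child);
                    if (child.bound < bestCost) open.push(child);
                }
            }
            return bestCost;
        }

        int main() {
            std::vector<std::vector<int>> cost = {{9, 2, 7}, {6, 4, 3}, {5, 8, 1}};
            std::cout << "minimum total assignment cost: " << branchAndBound(cost) << "\n";  // 9
        }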

    Toward an Analysis of Forward Pruning

    Several early game-playing computer programs used forward pruning (i.e., the practice of deliberately ignoring nodes that are believed unlikely to affect a game tree's minimax value), but this technique did not seem to result in good decision-making. The poor performance of forward pruning presents a major puzzle for AI research on game playing, because some version of forward pruning seems to be "what people do", and the best chess-playing programs still do not play as well as the best humans. As a step toward deeper understanding of how forward pruning affects quality of play, in this paper we set up a model of forward pruning on two abstract classes of binary game trees, and we use this model to investigate how forward pruning affects the accuracy of the minimax values returned. The primary result of our study is that forward pruning does better when there is a high correlation among the minimax values of sibling nodes in a game tree. This result suggests that forward pruning may possibly be a useful decision-making technique in certain kinds of games. In particular, we believe that bridge may be such a game.
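
    As a concrete, purely illustrative instance of the technique being analyzed, the sketch below runs a depth-limited minimax on an implicit binary game tree and forward-prunes any child whose static evaluation looks worse than its sibling's by more than a margin. The tree, the evaluation function, and the margin are placeholders of our own, not the paper's model.

        #include <algorithm>
        #include <iostream>

        // The game tree is an implicit complete binary tree: node `id` has children
        // 2*id + 1 and 2*id + 2.  staticEval is a cheap, deterministic stand-in for
        // a heuristic evaluation function.
        static int staticEval(unsigned long long id) {
            return (int)((id * 2654435761ULL) % 201) - 100;   // pseudo-random value in [-100, 100]
        }

        int minimax(unsigned long long id, int depth, bool maxPlayer,
                    int pruneMargin, long long& expanded) {
            ++expanded;
            if (depth == 0) return staticEval(id);            // leaf: return its static value
            unsigned long long kid[2] = {2 * id + 1, 2 * id + 2};
            // Forward pruning: rank the two children by static evaluation and skip a
            // child that looks worse than its sibling by more than pruneMargin from
            // the point of view of the player to move.
            int e0 = staticEval(kid[0]), e1 = staticEval(kid[1]);
            bool keep0, keep1;
            if (maxPlayer) {
                keep0 = (e0 + pruneMargin >= e1);
                keep1 = (e1 + pruneMargin >= e0);
            } else {
                keep0 = (e0 - pruneMargin <= e1);
                keep1 = (e1 - pruneMargin <= e0);
            }
            int best = maxPlayer ? -1000000 : 1000000;
            if (keep0) {
                int v = minimax(kid[0], depth - 1, !maxPlayer, pruneMargin, expanded);
                best = maxPlayer ? std::max(best, v) : std::min(best, v);
            }
            if (keep1) {
                int v = minimax(kid[1], depth - 1, !maxPlayer, pruneMargin, expanded);
                best = maxPlayer ? std::max(best, v) : std::min(best, v);
            }
            return best;
        }

        int main() {
            long long expandedFull = 0, expandedPruned = 0;
            // A margin of 1000 prunes nothing (static values lie in [-100, 100]),
            // so the first call returns the exact minimax value of this tree.
            int exact  = minimax(0, 12, true, 1000, expandedFull);
            int pruned = minimax(0, 12, true, 10,   expandedPruned);
            std::cout << "exact value "  << exact  << " after " << expandedFull  << " expansions\n"
                      << "pruned value " << pruned << " after " << expandedPruned << " expansions\n";
        }

    Comparing the two runs shows the trade-off the paper studies: the pruned search expands far fewer nodes, but its returned value may differ from the true minimax value.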

    Computing Geometric Boolean Operations by Input Directed Decomposition.

    This paper presents an algorithm to perform regularized Boolean operations on collections of simple polygons. The algorithm accepts two arbitrarily complex collections of disjoint polygons and returns two collections of polygons corresponding to the union and intersection, respectively. The algorithm is efficient and generalizes to higher dimensions. Given two collections of polygons, the algorithm recursively decomposes them into fragments using splitting lines determined from the collections' edges. This approach, which is called input directed decomposition, maintains exact representations of objects, and easily classifies an edge into either the union or the intersection set. By the use of edge orientation information, ambiguities caused by objects that touch along an edge are avoided. After edge classification, edge connectivity of polygons is used to allow creation of the polygons belonging to the union and the intersection collections.
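
    The edge-classification step described above rests on a simple geometric primitive: splitting an edge by a directed splitting line and assigning each fragment to one side. The sketch below is our own simplified 2D illustration of that primitive only, not the paper's algorithm or data structures.

        #include <iostream>
        #include <vector>

        struct Point   { double x, y; };
        struct Segment { Point a, b; };

        // Signed area test: > 0 if r lies to the left of the directed line p->q,
        // < 0 if to the right, 0 if collinear.
        double side(const Point& p, const Point& q, const Point& r) {
            return (q.x - p.x) * (r.y - p.y) - (q.y - p.y) * (r.x - p.x);
        }

        // Split segment s by the directed line through p->q, appending the resulting
        // fragment(s) to the left- or right-hand collection.
        void splitByLine(const Segment& s, const Point& p, const Point& q,
                         std::vector<Segment>& left, std::vector<Segment>& right) {
            double da = side(p, q, s.a);
            double db = side(p, q, s.b);
            if (da >= 0 && db >= 0) { left.push_back(s);  return; }   // entirely left (or on the line)
            if (da <= 0 && db <= 0) { right.push_back(s); return; }   // entirely right (or on the line)
            // Endpoints straddle the line: cut exactly at the crossing point.
            double t = da / (da - db);                                // da and db have opposite signs
            Point m{s.a.x + t * (s.b.x - s.a.x), s.a.y + t * (s.b.y - s.a.y)};
            if (da > 0) { left.push_back({s.a, m});  right.push_back({m, s.b}); }
            else        { right.push_back({s.a, m}); left.push_back({m, s.b}); }
        }

        int main() {
            std::vector<Segment> left, right;
            // Split the edge (0,-1)-(2,3) by the x-axis directed from (0,0) toward (1,0).
            splitByLine({{0, -1}, {2, 3}}, {0, 0}, {1, 0}, left, right);
            std::cout << left.size() << " fragment(s) left of the line, "
                      << right.size() << " right of it\n";            // 1 and 1
        }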

    Obtaining Boundaries with Respect: A Simple Approach to Performing Set Operations on Polyhedra.

    A regularized set operation on two solids can be separated into four steps: partition the faces of the boundaries of the two solids to impose respect, obtain an eight-way classification of the faces, create a solid according to the set operation, and reduce the representation to its minimal form. Of these four steps, the first step is the most difficult. This paper presents and proves correct a general approach for imposing respect on two boundary representations. The approach is based on a data-driven, binary form of decomposition.

    A Systematic Approach for Analyzing the Manufacturability of Machined Parts

    The ability to quickly introduce new quality products is a decisive factor in capturing market share. Because of pressing demands to reduce lead time, analyzing the manufacturability of the proposed design has become an important step in the design stage. This paper presents an approach for analyzing the manufacturability of machined parts. Evaluating the manufacturability of a proposed design involves determining whether or not it is manufacturable with a given set of manufacturing operations - and if so, then finding the associated manufacturing efficiency. Since there can be several different ways to manufacture a proposed design, this requires us to consider different ways to manufacture it, in order to determine which one best meets the design and manufacturing objectives. The first step in our approach is to identify all machining operations which can potentially be used to create the given design. Using these operations, we generate different operation plans for machining the part. Each time we generate a new operation plan, we examine whether it can produce the desired shape and tolerances, and calculate its manufacturability rating. If no operation plan can be found that is capable of producing the design, then the given design is considered unmachinable; otherwise, the manufacturability rating for the design is the rating of the best operation plan. We anticipate that by providing feedback about possible problems with the design, this work will help in speeding up the evaluation of new product designs in order to decide how or whether to manufacture them. Such a capability will be useful in responding quickly to changing demands and opportunities in the marketplace.
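
    The evaluation loop described above can be summarized schematically. In the sketch below, OperationPlan, its fields, and the example plans are placeholders of our own; the paper's actual plan generation and rating computations are not shown.

        #include <iostream>
        #include <optional>
        #include <string>
        #include <vector>

        // Schematic stand-in for an operation plan; the fields below are placeholders,
        // not the paper's representation.
        struct OperationPlan {
            std::string description;  // e.g. an ordered list of machining operations
            bool meetsTolerances;     // can it produce the desired shape and tolerances?
            double rating;            // manufacturability rating of this plan
        };

        // Returns the rating of the best feasible plan, or an empty optional if no
        // plan can produce the design (i.e. the design is unmachinable).
        std::optional<double> evaluateManufacturability(const std::vector<OperationPlan>& plans) {
            std::optional<double> best;
            for (const OperationPlan& plan : plans) {
                if (!plan.meetsTolerances) continue;          // cannot produce the design
                if (!best || plan.rating > *best) best = plan.rating;
            }
            return best;
        }

        int main() {
            // Purely illustrative alternatives and ratings.
            std::vector<OperationPlan> plans = {
                {"drill, then pocket-mill", true,  0.72},
                {"face-mill, then drill",   false, 0.00},     // violates a tolerance
                {"slot-mill, then drill",   true,  0.81},
            };
            if (auto r = evaluateManufacturability(plans))
                std::cout << "manufacturability rating of the design: " << *r << "\n";   // 0.81
            else
                std::cout << "design is unmachinable with the given operations\n";
        }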

    Generation of Alternative Feature-Based Models and Precedence Orderings for Machining Applications

    For machining purposes, a part is often considered to be a feature-based model (FBM), i.e., a collection of machining features. However, often there can be several different FBMs of the same part. These models correspond to different sets of machining operations, with different precedence constraints. Which of these sets of machining operations is best depends on several factors, including dimensions, tolerances, surface finishes, availability of machine tools and cutting tools, fixturability, and optimization criteria. Thus, these alternatives should be generated and evaluated. In this paper we present the following results:
    1. We give general mathematical definitions of machining features and FBMs.
    2. We present a systematic way to generate the alternative FBMs for a part, given an initial FBM for the part.
    3. For each FBM, interactions among the features will impose precedence constraints on the possible orderings in which these features can be machined. We show how to generate these precedence constraints automatically for each interpretation.
    4. We show how to organize the above precedence constraints into a time-order graph that represents all feasible orderings in which the features can be machined, and examine the time-order graph to see if it is consistent. If it is not consistent, then there is no way to machine this particular interpretation.
    This work represents a step toward our overall approach of developing ways for automatically generating the alternative ways in which a part can be machined, and evaluating them to see how well they can do at creating the desired part. We anticipate that the information provided by this analysis will be useful both for process planning and concurrent design.
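
    At a minimum, the consistency test in point 4 must determine whether the precedence constraints admit any feasible ordering at all. The sketch below is our own minimal version of such a check, assuming consistency is interpreted as acyclicity of the precedence relation; the paper's time-order graph may encode more than this plain digraph.

        #include <iostream>
        #include <queue>
        #include <utility>
        #include <vector>

        // Returns a feasible machining order if one exists, or an empty vector if the
        // precedence constraints are inconsistent (the graph contains a cycle).
        // Uses Kahn's topological-sort algorithm.
        std::vector<int> feasibleOrder(int numFeatures,
                                       const std::vector<std::pair<int, int>>& mustPrecede) {
            std::vector<std::vector<int>> succ(numFeatures);
            std::vector<int> indegree(numFeatures, 0);
            for (auto [before, after] : mustPrecede) {
                succ[before].push_back(after);
                ++indegree[after];
            }
            std::queue<int> ready;
            for (int f = 0; f < numFeatures; ++f)
                if (indegree[f] == 0) ready.push(f);
            std::vector<int> order;
            while (!ready.empty()) {
                int f = ready.front(); ready.pop();
                order.push_back(f);
                for (int g : succ[f])
                    if (--indegree[g] == 0) ready.push(g);
            }
            if ((int)order.size() != numFeatures) order.clear();   // cycle => inconsistent
            return order;
        }

        int main() {
            // Hypothetical example: feature 0 must be machined before 1 and 2,
            // and 1 before 2 (e.g. a face must be milled before holes are drilled).
            auto order = feasibleOrder(3, {{0, 1}, {0, 2}, {1, 2}});
            if (order.empty()) {
                std::cout << "precedence constraints are inconsistent\n";
            } else {
                std::cout << "one feasible order:";
                for (int f : order) std::cout << ' ' << f;
                std::cout << '\n';   // one feasible order: 0 1 2
            }
        }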