
    Backward assembly planning with DFA analysis

    An assembly planning system that operates on a recursive decomposition of an assembly into subassemblies is presented. The system analyzes assembly cost in terms of stability, directionality, and manipulability to guide the generation of preferred assembly plans. The planning incorporates special processes, such as cleaning, testing, and labeling, that must occur during assembly. Additionally, the planner handles nonreversible as well as reversible assembly tasks through backward assembly planning. To increase planning efficiency, the system avoids analyzing decompositions that do not correspond to feasible assembly tasks. This is achieved by grouping and merging parts that cannot be decomposed at the current stage of backward assembly planning, due to the requirement of special processes and the constraint of interconnection feasibility. The system includes methods for evaluating assembly cost in terms of the number of fixtures (or holding devices) and reorientations required for assembly, through the analysis of stability, directionality, and manipulability. All these factors are used in defining the cost and heuristic functions for an AO* search for an optimal plan.
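    The additive cost structure described above can be illustrated with a minimal sketch. The names (`AssemblyTask`, `plan_cost`) and the weights are hypothetical; the paper's actual cost and heuristic functions for the AO* search are not given here.

    ```python
    from dataclasses import dataclass

    @dataclass
    class AssemblyTask:
        needs_fixture: bool   # a holding device is required for stability
        reorients: int        # reorientations forced by directionality

    def plan_cost(tasks, fixture_weight=2.0, reorient_weight=1.0):
        """Evaluate a plan (a sequence of tasks) by counting fixtures and
        reorientations, combined as a weighted sum; such a function could
        serve as the accumulated-cost term in an AO*-style search."""
        fixtures = sum(t.needs_fixture for t in tasks)
        reorientations = sum(t.reorients for t in tasks)
        return fixture_weight * fixtures + reorient_weight * reorientations

    plan = [AssemblyTask(True, 1), AssemblyTask(False, 0), AssemblyTask(True, 2)]
    print(plan_cost(plan))  # 2 fixtures and 3 reorientations -> 7.0
    ```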

    Logic Programming as Constructivism

    The features of logic programming that seem unconventional from the viewpoint of classical logic can be explained in terms of constructivistic logic. We motivate and propose a constructivistic proof theory of non-Horn logic programming, and then apply this formalization to establish results of practical interest. First, we show that 'stratification' can be motivated in a simple and intuitive way. Relying on similar motivations, we introduce the larger classes of 'loosely stratified' and 'constructively consistent' programs. Second, we give a formal basis for introducing quantifiers into queries and logic programs by defining 'constructively domain independent' formulas. Third, we extend the Generalized Magic Sets procedure to loosely stratified and constructively consistent programs by relying on a 'conditional fixpoint' procedure.
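    Stratification itself admits a compact check: a program is stratified iff its predicate dependency graph has no cycle through a negative edge. The rule representation below is a hypothetical sketch, not the paper's formalism.

    ```python
    from collections import defaultdict

    def is_stratified(rules):
        """rules: list of (head, body) where body is a list of
        (predicate, is_negated) pairs. A program is stratified iff no
        negative dependency lies on a cycle of the dependency graph."""
        edges = defaultdict(set)   # head -> body predicates it depends on
        neg = set()                # (head, body) pairs linked via negation
        for head, body in rules:
            for pred, negated in body:
                edges[head].add(pred)
                if negated:
                    neg.add((head, pred))

        def reaches(src, dst):
            # Depth-first search: can src reach dst through dependencies?
            seen, stack = set(), [src]
            while stack:
                p = stack.pop()
                if p == dst:
                    return True
                if p in seen:
                    continue
                seen.add(p)
                stack.extend(edges[p])
            return False

        # A negative edge (h, b) is on a cycle iff b can reach h back.
        return all(not reaches(b, h) for (h, b) in neg)

    # p :- q, not r.   r :- s.      -> stratified
    print(is_stratified([("p", [("q", False), ("r", True)]),
                         ("r", [("s", False)])]))   # True
    # p :- not q.      q :- p.      -> negation through a cycle
    print(is_stratified([("p", [("q", True)]),
                         ("q", [("p", False)])]))   # False
    ```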

    Technology assessment of advanced automation for space missions

    Six general classes of technology requirements derived during the mission definition phase of the study were identified as having maximum importance and urgency: autonomous world-model-based information systems, learning and hypothesis formation, natural language and other man-machine communication, space manufacturing, teleoperators and robot systems, and computer science and technology.

    Diffusion or War? Foucault as a Reader of Tarde

    The objective of this chapter is to clarify the social theory underlying Foucault's genealogy of power/knowledge through a comparison with Tarde's microsociology. Nietzsche is often identified as the direct (and sole) predecessor of this genealogy, and the usual criticisms concern the intricate relations between Foucault and Marx. These perspectives fail to point to another, more direct, antecedent of Foucault's microphysics: the microsociology of Gabriel Tarde. Bio-power technologies must be read as Tardean inventions that, by propagation, have reconfigured pre-existing social spaces, building modern societies. We will see how the Tardean source in Foucault's genealogy sheds new light on the micro-socio-logic involved in it, enabling us to identify some of its aporias and to imagine some solutions in this respect as well.

    Lexical Density in EFL Indonesian Textbooks: A Comparative Analysis

    Lexical density is a component that determines the complexity level of a text, so measuring it shows how challenging a particular text is to read. However, the lexical density of English reading materials, especially in Indonesian EFL textbooks from government and non-government publishers, remains underexplored. This research employed a quantitative design to investigate how dense the lexical items are in these two EFL textbooks. The results showed that the non-government textbook is lexically denser across all text genres than the government textbook. In brief, by knowing the lexical density of a text, teachers can determine the appropriate text level to teach, one that is neither too low nor too high.
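    The measure itself is simple: lexical density is the ratio of lexical (content) words to total words. A minimal sketch follows; the function-word list is an illustrative stand-in, not the tagging scheme the study actually used.

    ```python
    # Hypothetical, abbreviated function-word list for illustration only.
    FUNCTION_WORDS = {
        "the", "a", "an", "is", "are", "was", "to", "of", "in", "and",
        "or", "it", "that", "this", "on", "for", "with", "by",
    }

    def lexical_density(text: str) -> float:
        """Lexical density = lexical (content) words / total words."""
        words = [w.strip(".,!?;:").lower() for w in text.split()]
        words = [w for w in words if w]
        if not words:
            return 0.0
        lexical = [w for w in words if w not in FUNCTION_WORDS]
        return len(lexical) / len(words)

    # 6 words, of which "cat", "sat", "mat" are lexical -> 3/6
    print(lexical_density("The cat sat on the mat."))  # 0.5
    ```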

    On the evolution of conceptual modeling

    Since the 1980s, the need to overcome idiosyncrasies among the modeling approaches of the various sub-disciplines of computing has increased. This paper uses the theoretical model of evolution to analyze how computing and conceptual modeling have changed. It is concluded that computing has become a social phenomenon with a technical core, and that relying on (formal) model semantics as the sole tool for discussing conceptual modeling is therefore no longer adequate. A number of language games of computing are identified, and the task is set to describe these language games to the extent necessary for deciding whether or not they can serve as the foundation of computing.