
    Learning a Planning Domain Model from Natural Language Process Manuals

    Artificial intelligence planning techniques have been widely used in many applications. A major challenge is automating the construction of a planning model, especially for planning applications based on natural language (NL) input: this requires analysing and understanding NL text, and no general learning technique exists for real-world applications. In this article, we investigate an intelligent planning technique for natural disaster management, e.g. typhoon contingency plan generation, from natural language process manuals. The planning model optimises the management operations that must be carried out in a short time when a disaster occurs. Instead of building the planning model manually, we aim to automate its generation by extracting disaster-management-related content with NL processing (NLP) techniques. The learning input comes from published documents that describe the operational process of preventing potential losses in typhoon management. We adopt a classical planning model, namely the Planning Domain Definition Language (PDDL), for typhoon contingency plan generation. We propose FPTCP, a Framework of Planning Typhoon Contingency Plans, for learning a PDDL domain model from NL text. We adapt NLP techniques to construct ternary templates from the sentences of the NL input, from which actions and their objects are extracted to build a domain model. We also develop a comprehensive suite of user-interaction components that involve users in improving the learned domain models: users remove semantic duplicates among NL objects and select model-generated actions and predicates to better fit the PDDL domain model. We detail the implementation steps of the proposed FPTCP and evaluate its performance on real-world typhoon datasets. In addition, we compare FPTCP with two state-of-the-art approaches in narrative-generation applications and discuss its capabilities and limitations.
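A minimal sketch of the ternary-template idea described above: (subject, verb, object) triples extracted from sentences are turned into PDDL action schemas. The function, type names and precondition/effect pattern here are illustrative assumptions, not the paper's actual FPTCP implementation.

```python
# Hypothetical sketch: map (subject, verb, object) ternary templates to PDDL
# action schemas. The precondition/effect pattern is a placeholder; FPTCP's
# real rules are more elaborate.

def ternary_to_pddl(triples):
    """Convert (subject, verb, object) triples into PDDL action strings."""
    actions = []
    for subj, verb, obj in triples:
        action = (
            f"(:action {verb}\n"
            f"  :parameters (?a - {subj} ?o - {obj})\n"
            f"  :precondition (not ({verb}-done ?a ?o))\n"
            f"  :effect ({verb}-done ?a ?o))"
        )
        actions.append(action)
    return actions

# hypothetical triples extracted from a typhoon-management manual
triples = [("rescue-team", "evacuate", "resident"),
           ("government", "issue", "warning")]
for a in ternary_to_pddl(triples):
    print(a)
```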

    Computational structure‐based drug design: Predicting target flexibility

    The role of molecular modeling in drug design has experienced a significant revamp in the last decade. The increase in computational resources and molecular models, along with software developments, is finally introducing a competitive advantage in the early phases of drug discovery. Medium and small companies with a strong focus on computational chemistry are being created, some of which have introduced important leads into drug design pipelines. An important source of this success is the extraordinary development of faster and more efficient techniques for describing flexibility in three-dimensional structural molecular modeling. At different levels, from docking techniques to atomistic molecular dynamics, conformational sampling of receptor and drug results in improved predictions, such as screening enrichment, discovery of transient cavities, etc. In this review article we perform an extensive analysis of these modeling techniques, dividing them into high and low throughput, and emphasizing their application to drug design studies. We conclude the review with a section describing our Monte Carlo method, PELE, recently highlighted as an outstanding advance in an international blind competition and in industrial benchmarks. We acknowledge the BSC-CRG-IRB Joint Research Program in Computational Biology. This work was supported by a grant from the Spanish Government (CTQ2016-79138-R). J.I. acknowledges support from SVP-2014-068797, awarded by the Spanish Government. Peer Reviewed. Postprint (author's final draft).
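The Monte Carlo sampling mentioned above can be illustrated by the Metropolis acceptance rule on a toy one-dimensional energy surface. This is a generic sketch, not PELE itself: PELE combines Monte Carlo perturbations with protein-structure-prediction moves, whereas this example only shows the acceptance criterion.

```python
import math
import random

# Minimal Metropolis Monte Carlo sketch of conformational sampling on a toy
# 1-D energy surface (a harmonic well). Illustrative only; not PELE.

def metropolis_step(x, energy, step=0.5, kT=0.6, rng=random.random):
    """Propose a random perturbation and accept it with the Metropolis rule."""
    x_new = x + step * (2 * rng() - 1)         # random perturbation
    dE = energy(x_new) - energy(x)
    if dE <= 0 or rng() < math.exp(-dE / kT):  # downhill moves always accepted
        return x_new
    return x                                   # reject: keep old conformation

def toy_energy(x):
    return (x - 1.0) ** 2   # harmonic well centered at x = 1

random.seed(0)
x = 0.0
for _ in range(5000):
    x = metropolis_step(x, toy_energy)
# after many steps, x samples the Boltzmann distribution around the minimum
```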

    On-line optimisation and experimental design analysis for the investigations on the surface roughness produced by roller burnishing: a thesis submitted in partial fulfilment of the requirements for the degree of Master of Technology in Manufacturing and Industrial Technology at Massey University

    This thesis describes the improvement of the surface finish of metals by roller burnishing, a cold-working, non-metal-removal, plastic-deformation process. Roller burnishing is a popular finishing process, and surface finish has a positive and prolonged effect on the functioning of machined parts. In this work roller burnishing is used to obtain a high-quality surface finish on materials such as aluminum, copper, mild steel and brass. A roller burnishing tool was designed and fabricated for the project, and a test rig was set up on a center lathe to conduct experiments. The angle of approach and the radius of the roller burnishing tool were examined for optimisation, and the number of passes of the tool was also among the factors under study. The surface finish of the roller-burnished cylindrical surfaces was examined for soft materials (aluminum and copper) and for hard materials (mild steel and brass). Optimum values of feed, speed and depth of penetration were suggested by conducting experiments that varied one factor at a time while holding the rest constant. Since the factors are interdependent, this one-factor-at-a-time technique does not give accurate results for either the main effects or any interactions present, yet it is not practical to vary more than one factor at a time experimentally. Hence a computer-based theoretical approach was suggested, with process parameters and surface-quality data acquired from the shop floor. The collected data were then analysed by the Design of Experiments method, an advanced statistical quality-analysis method, to determine the significant process parameters influencing the surface finish. The basic design and analysis of the process was carried out by full factorial analysis and ANOVA for a two-level, three-factor (2³) experimental design.
More experiments on the roller burnishing process were conducted to collect data using experiments designed by the Central Composite Design (CCD) method. These experiments were used to determine the interactions among the factors. The analysis was carried out by Response Surface Methodology (RSM) to find the optimum values of the more significant process parameters. The final surface finish for mild steel was found to be 0.32 µm with a feed of 85 µm/rev and a depth of penetration of 70 µm. The experimental and theoretical results were compared.
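The two-level, three-factor (2³) full factorial analysis mentioned above can be sketched as follows: eight runs in coded units, with each main effect estimated as the average response at the high level minus the average at the low level. The response values here are made up for demonstration; they are not the thesis data.

```python
import itertools

# Illustrative 2^3 full factorial analysis: estimate the main effects of
# feed, speed and depth of penetration on surface roughness Ra.
# The Ra values below are invented for demonstration only.

levels = list(itertools.product([-1, 1], repeat=3))  # 8 runs, coded units
# hypothetical measured Ra (um) for each (feed, speed, depth) run
response = [0.52, 0.61, 0.48, 0.55, 0.44, 0.50, 0.38, 0.43]

def main_effect(factor_index):
    """Average response at +1 minus average response at -1 for one factor."""
    high = [y for run, y in zip(levels, response) if run[factor_index] == 1]
    low = [y for run, y in zip(levels, response) if run[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for name, i in [("feed", 0), ("speed", 1), ("depth", 2)]:
    print(f"{name}: {main_effect(i):+.4f}")
```

A full ANOVA would additionally partition the variance across these effects and their interactions to judge significance.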

    Optical study of the anisotropic erbium spin flip-flop dynamics

    We investigate the erbium flip-flop dynamics as a limiting factor of the electron spin lifetime and, more generally, as an indirect source of decoherence in rare-earth-doped insulators. Despite the random isotropic arrangement of dopants in the host crystal, the dipolar interaction strongly depends on the magnetic field orientation, following the strong anisotropy of the g-factor. In Er³⁺:Y₂SiO₅, we observe by transient optical spectroscopy a three-orders-of-magnitude variation of the erbium flip-flop rate (10 ppm dopant concentration). The measurements in two different samples, with 10 ppm and 50 ppm concentrations, are well supported by our analytic modeling of the dipolar coupling between identical spins with an anisotropic g-tensor. The model can be applied to other rare-earth-doped materials. We extrapolate the calculation to Er³⁺:CaWO₄, Er³⁺:LiNbO₃ and Nd³⁺:Y₂SiO₅ at different concentrations.
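The g² scaling behind the orientation dependence discussed above can be illustrated with the point-dipole coupling between two identical electron spins. This is an order-of-magnitude sketch using the isotropic point-dipole formula with an effective g; the paper's full anisotropic g-tensor model is more involved, and the g values used below are rough assumed bounds.

```python
import math

# Order-of-magnitude sketch of the secular dipole-dipole coupling between two
# identical electron spins, illustrating the g^2 dependence. Isotropic
# point-dipole formula only; not the paper's anisotropic g-tensor model.

MU0 = 4 * math.pi * 1e-7        # vacuum permeability (T*m/A)
MU_B = 9.274009994e-24          # Bohr magneton (J/T)
H = 6.62607015e-34              # Planck constant (J*s)

def dipolar_coupling_hz(g, r, theta):
    """Dipolar coupling (Hz) between two spins with effective g-factor g,
    separated by r (m), at angle theta between r and the magnetic field."""
    prefactor = MU0 * (g * MU_B) ** 2 / (4 * math.pi * H * r ** 3)
    return prefactor * (1 - 3 * math.cos(theta) ** 2)

# Er3+:Y2SiO5 has a highly anisotropic g-tensor (effective g roughly in the
# 0.7 to 15 range depending on field orientation, an assumed span here);
# since the coupling scales as g^2, the flip-flop rate can vary by orders
# of magnitude with the field direction.
for g in (0.7, 15.0):
    print(f"g = {g:5.1f}: {dipolar_coupling_hz(g, 10e-9, math.pi / 2):.3g} Hz")
```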

    Developing student’s accounting competencies using Astin’s I-E-O model: an identification of key educational inputs based on Indonesian student perspectives

    This paper discusses a model for developing Students' Accounting Competencies (SAC) using Astin's Input-Environment-Outcome (I-E-O) model. SAC, based on the AICPA core competencies, is considered important due to changes in business and the environment. Student Motivation, Student Previous Achievement, Student Demographic Characteristics, Learning Facilities, and Comfort of Class Size are the educational inputs; Student Engagement and SAC are proxies for Environment and Outcome, respectively. Empirically, all of these inputs except Student Demographic Characteristics are important for improving SAC, and Student Engagement effectively mediates the influence of the inputs on SAC. The I-E-O model is appropriate for analysing the relationships among a single input, Student Engagement, and SAC, but becomes less powerful for analysing simultaneous relationships among multiple inputs, Student Engagement, and SAC. Future research is required on using other assessments to gauge SAC, identifying other significant inputs, identifying the impact of actual class size on Student Engagement and SAC, and developing Student Engagement for accounting courses.
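The mediation claim above (Student Engagement carrying the effect of an input onto SAC) can be sketched with a Baron-Kenny style decomposition: the total effect of an input X on outcome Y splits exactly into a direct effect and an indirect effect through mediator M. The data below are simulated for illustration; they are not the paper's survey data, and the path coefficients are arbitrary assumptions.

```python
import random

# Toy Baron-Kenny style mediation check: does engagement (M) carry the effect
# of an educational input (X) onto competencies (Y)? Simulated data only.

random.seed(42)
n = 500
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.6 * x + random.gauss(0, 0.5) for x in X]                    # X -> M
Y = [0.7 * m + 0.1 * x + random.gauss(0, 0.5) for x, m in zip(X, M)]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

# simple-regression slopes
a = cov(X, M) / cov(X, X)          # X -> M path
c_total = cov(X, Y) / cov(X, X)    # total effect of X on Y

# two-predictor regression Y ~ M + X via centered normal equations
d = cov(M, M) * cov(X, X) - cov(M, X) ** 2
b = (cov(M, Y) * cov(X, X) - cov(X, Y) * cov(M, X)) / d        # M -> Y path
c_direct = (cov(X, Y) * cov(M, M) - cov(M, Y) * cov(M, X)) / d  # direct path

indirect = a * b   # effect of X on Y carried through M
print(f"total={c_total:.2f} direct={c_direct:.2f} indirect={indirect:.2f}")
```

In OLS the decomposition is exact: total = direct + indirect; a large indirect share relative to the direct path is what "Student Engagement effectively mediates" expresses.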