144 research outputs found
Nonstrict hierarchical reinforcement learning for interactive systems and robots
Conversational systems and robots that use reinforcement learning for policy optimization in large domains often face the problem of limited scalability. This problem has been addressed either by using function approximation techniques that approximate the true value function of a policy or by using a hierarchical decomposition of a learning task into subtasks. We present a novel approach to dialogue policy optimization that combines the benefits of hierarchical control and function approximation, and that allows flexible transitions between dialogue subtasks to give human users more control over the dialogue. To this end, each reinforcement learning agent in the hierarchy is extended with a subtask transition function and a dynamic state space to allow flexible switching between subdialogues. In addition, the subtask policies are represented with linear function approximation in order to generalize decision making to situations unseen in training. The proposed approach is evaluated in an interactive conversational robot that learns to play quiz games. Experimental results, from simulations and real users, provide evidence that the proposed approach can lead to more flexible (natural) interactions than strict hierarchical control and that it is preferred by human users.
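The two core ideas of the abstract — subtask policies represented by linear function approximation, and a transition function that lets the user switch subdialogues at any turn — can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation; all names, feature vectors, and weights are invented.

```python
# Toy sketch of a nonstrict hierarchical dialogue agent. Subtask policies
# use linear function approximation (Q-value = dot product of weights and
# state features); a subtask transition function allows user-initiated
# switching between subdialogues instead of strict completion order.
# All subtasks, actions, and numbers here are hypothetical.

def q_value(weights, features):
    """Linearly approximated Q-value: dot product of weight and feature vectors."""
    return sum(w * f for w, f in zip(weights, features))

def best_action(action_weights, features):
    """Greedy action selection over the linearly approximated Q-values."""
    return max(action_weights, key=lambda a: q_value(action_weights[a], features))

# Per-subtask, per-action weight vectors (invented values).
weights = {
    "ask_question": {"ask": [0.9, 0.1], "confirm": [0.2, 0.4]},
    "give_feedback": {"praise": [0.5, 0.3], "hint": [0.6, 0.8]},
}

def subtask_transition(current, user_signal):
    """Nonstrict hierarchy: the user may force a switch mid-subtask."""
    if user_signal == "switch":
        return "give_feedback" if current == "ask_question" else "ask_question"
    return current

state_features = [1.0, 0.5]
task = subtask_transition("ask_question", "switch")   # user requests a switch
action = best_action(weights[task], state_features)
```

Under strict hierarchical control the `subtask_transition` step would not exist: a subdialogue could only end by reaching a terminal state.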
Density functional theory study of the multimode Jahn-Teller effect – ground state distortion of benzene cation
The multideterminantal DFT approach used to analyze Jahn-Teller (JT) active molecules is described. An extension of this method to the analysis of adiabatic potential energy surfaces and the multimode JT effect is presented. A conceptually simple model, based on the analogy between the JT distortion and reaction coordinates, gives further insight into the microscopic origin of the JT effect. Within the harmonic approximation, the JT distortion can be expressed as a linear combination of all totally symmetric normal modes of the low-symmetry minimum-energy conformation, which allows the Intrinsic Distortion Path (IDP) to be calculated exactly, from the high-symmetry nuclear configuration to the low-symmetry energy minimum. It is possible to quantify the contribution of different normal modes to the distortion, their contribution to the total stabilization energy, and how these contributions change along the IDP. Notably, the results obtained by the multideterminantal DFT and IDP methods for different classes of JT-active molecules are mutually consistent and agree with available theoretical and experimental values. As an example, a detailed description of the ground-state distortion of the benzene cation is given.
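The decomposition step described above — writing the JT distortion as a linear combination of totally symmetric normal modes and quantifying each mode's contribution — reduces to projecting the distortion vector onto the mode vectors. A hedged toy sketch (two invented orthonormal modes in three coordinates, not data from the study):

```python
# Toy decomposition of a Jahn-Teller distortion vector into contributions
# of orthonormal totally symmetric normal modes, as in an IDP-style
# analysis. All vectors and values are invented for illustration.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Orthonormal displacement vectors of two totally symmetric modes
# (mass-weighted coordinates, 3-dimensional toy example).
modes = {
    "mode_a": [1.0, 0.0, 0.0],
    "mode_b": [0.0, 0.6, 0.8],
}

# Distortion vector from the high-symmetry configuration to the
# low-symmetry energy minimum.
distortion = [0.3, 0.6, 0.8]

# Each mode's weight is its projection onto the distortion; with
# orthonormal modes, the squared weights give fractional contributions.
weights = {name: dot(q, distortion) for name, q in modes.items()}
norm2 = sum(w * w for w in weights.values())
fractions = {name: w * w / norm2 for name, w in weights.items()}
```

Repeating this projection at points sampled along the path from the high-symmetry to the low-symmetry geometry shows how each mode's contribution evolves along the IDP.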
Making effective use of healthcare data using data-to-text technology
Healthcare organizations are in a continuous effort to improve health outcomes, reduce costs and enhance the patient experience of care. Data is essential to measure and help achieve these improvements in healthcare delivery. Consequently, an influx of data from various clinical, financial and operational sources is now flooding healthcare organizations and their patients. The effective use of this data, however, is a major challenge. Clearly, text is an important medium for making data accessible. Financial reports are produced to assess healthcare organizations on key performance indicators and to steer their healthcare delivery. Similarly, at a clinical level, data on patient status is conveyed by means of textual descriptions to facilitate patient review, shift handover and care transitions. Likewise, patients are informed about data on their health status and treatments via text, in the form of reports or via ehealth platforms, by their doctors. Unfortunately, such text is the outcome of a highly labour-intensive process when written by healthcare professionals. It is also prone to incompleteness and subjectivity, and hard to scale up to different domains, wider audiences and varying communication purposes. Data-to-text is a recent breakthrough technology in artificial intelligence which automatically generates natural language, in the form of text or speech, from data. This chapter provides a survey of data-to-text technology, with a focus on how it can be deployed in a healthcare setting. It will (1) give an up-to-date synthesis of data-to-text approaches, (2) give a categorized overview of use cases in healthcare, (3) make a strong case for evaluating and implementing data-to-text in a healthcare setting, and (4) highlight recent research challenges. (27 pages, 2 figures, book chapter)
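To make the idea of data-to-text concrete: one of the classic approach families is rule- or template-based generation, which maps structured readings to sentences. A minimal sketch, with an invented patient, reading, and clinical thresholds (not taken from the chapter):

```python
# Minimal rule/template-based data-to-text illustration: turn a numeric
# vital-sign reading into a short clinical-style sentence. The patient
# name, reading, and thresholds below are invented for illustration.

def verbalise_heart_rate(patient, bpm):
    """Map a heart-rate reading (beats per minute) to a sentence."""
    if bpm < 60:
        level = "bradycardic"
    elif bpm <= 100:
        level = "within the normal range"
    else:
        level = "tachycardic"
    return f"{patient}'s heart rate is {bpm} bpm, {level}."

report = verbalise_heart_rate("Patient A", 112)
```

Modern data-to-text systems replace or augment such hand-written rules with statistical or neural generation, but the input-to-text mapping task is the same.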
A Biological Global Positioning System: Considerations for Tracking Stem Cell Behaviors in the Whole Body
Many recent research studies have proposed stem cell therapy as a treatment for cancer, spinal cord injuries, brain damage, cardiovascular disease, and other conditions. Some of these experimental therapies have been tested in small animals and, in rare cases, in humans. Medical researchers anticipate extensive clinical applications of stem cell therapy in the future. A major challenge to the development of clinically applied stem cell therapy in medical practice, however, remains the lack of efficient stem cell tracking methods. As a result, the fate of the vast majority of stem cells transplanted into the human central nervous system (CNS), particularly any detrimental effects, remains unknown. The paucity of knowledge concerning basic stem cell biology—survival, migration, differentiation, and integration in real time when transplanted into the damaged CNS—remains a bottleneck in the attempt to design stem cell therapies for CNS diseases. Even though excellent histological techniques remain the gold standard, no good in vivo techniques are currently available to assess a transplanted graft for migration, differentiation, or survival. To address these issues, we propose strategies to investigate the lineage fate determination of human embryonic stem cell (hESC)-derived cells transplanted in vivo into the CNS. Here, we describe a comprehensive biological Global Positioning System (bGPS) to track transplanted stem cells. But first, we review four currently used standard methods for tracking stem cells in vivo: magnetic resonance imaging (MRI), bioluminescence imaging (BLI), positron emission tomography (PET) imaging, and fluorescence imaging (FLI) with quantum dots.
We summarize these modalities and propose criteria that can be employed to rank their practical usefulness for specific applications. Based on the results of this review, we argue that additional qualities are still needed to advance these modalities toward clinical applications. We then discuss an ideal procedure for labeling and tracking stem cells in vivo and, finally, present a novel imaging system based on our experiments.
A new benchmark dataset with production methodology for short text semantic similarity algorithms
This research presents a new benchmark dataset for evaluating Short Text Semantic Similarity (STSS) measurement algorithms, together with the methodology used for its creation. The power of the dataset is evaluated by using it to compare two established algorithms, STASIS and Latent Semantic Analysis. The dataset focuses on measures for use in Conversational Agents; other potential applications include email processing and data mining of social networks. Such applications involve integrating an STSS algorithm into a complex system, but STSS algorithms must be evaluated in their own right, and compared with others for effectiveness, before systems integration. Semantic similarity is an artifact of human perception; its evaluation is therefore inherently empirical and requires benchmark datasets derived from human similarity ratings. The new dataset of 64 sentence pairs, STSS-131, has been designed to meet these requirements, drawing on a range of resources from traditional grammar to cognitive neuroscience. The human ratings were obtained from a set of trials using new and improved experimental methods, with validated measures and statistics. The results illustrate the increased challenge and the potential longevity of the STSS-131 dataset as the gold standard for future STSS algorithm evaluation.
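Evaluation against a benchmark of this kind typically means correlating an algorithm's similarity scores with the mean human ratings over the sentence pairs. A hedged sketch of that comparison (the scores below are invented, not STSS-131 data):

```python
# Sketch of benchmark-based STSS evaluation: Pearson correlation between
# an algorithm's similarity scores and mean human ratings over a set of
# sentence pairs. The four toy score pairs below are invented; STSS-131
# itself contains 64 sentence pairs.

from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient of two sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

human = [0.1, 0.4, 0.8, 0.9]        # mean human similarity ratings (toy)
algorithm = [0.2, 0.35, 0.7, 0.95]  # algorithm similarity scores (toy)
r = pearson(human, algorithm)
```

A higher correlation with the human ratings indicates a better STSS measure; comparing two algorithms on the same benchmark comes down to comparing their coefficients (with appropriate significance testing).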
Lead(II) coordination polymers driven by pyridine-hydrazine donors: from anion-guided self-assembly to structural features
In this work, we report extensive experimental and theoretical investigations of a new series of PbII coordination polymers exhibiting extended supramolecular architectures, namely [Pb2(LI)(NCS)4]n (1), [Pb(HLII)I2]n (2), [Pb(LIII)I]n (3) and [Pb(HLIV)(NO3)2]n·nMeOH (4), which were self-assembled from different PbII salts and various pyridine-hydrazine based linkers, namely 1,2-bis(pyridin-3-ylmethylene)hydrazine (LI), (pyridin-4-ylmethylene)isonicotinohydrazide (HLII), 1-(pyridin-2-yl)ethylidenenicotinohydrazide (HLIII) and phenyl(pyridin-2-yl)methylenenicotinohydrazide (HLIV), respectively. It is recognized that the origin of the self-assembly is fundamentally rooted in the dual donor (6s2/6p0 hybridized lone electron pair) and electrophilic behaviour of PbII. This allows the production of extended topologies, from a 1D polymeric chain in 4, through a 2D layer in 2, to the 3D frameworks in 1 and 3, predominantly due to the cooperative action of both covalent and non-covalent tetrel interactions of the overall type Pb–X (X = O, N, S, I). Counterintuitively, the latter, seemingly weak interactions appear to be even stronger than the typical covalent bonds, due to the presence of a set of supportive London-dispersion-dominated contacts — π⋯π, Lp⋯π, C–H⋯O, C–H⋯I and C–H⋯H–C — as well as the more typical, mainly electrostatically driven N–H⋯O and N/O–H⋯O hydrogen bonds. It is revealed that the constituent, generally strong tetrel-type Pb–X (X = O, N, S, I) bonds, though dominated by a classic Coulomb term, are characterized by a very important London dispersion constituent, extremely strong relativistic effects, and a two-way dative-covalent Pb ↔ X electron charge delocalization contribution, as revealed by the Extended Transition State Natural Orbital for Chemical Valence (ETS-NOCV) charge and energy decomposition scheme.
The analysis further reveals that the pyridine-hydrazine linkers are also excellent London dispersion donors, and that these, together with the donor-acceptor properties of the heavy (relativistic) PbII atoms and the nucleophilic counterions, lead to the extended self-assembly of 1-4.
A Novel Method for Specimen Preparation and Analysis of CVD Diamond Coated Tools Using Focussed Ion Beams (FIB) and Scanning Electron Microscopy (SEM)
Investigations of the microscopic properties of chemical vapor deposited diamond coatings on tungsten carbide tools are important for understanding the coating–substrate interface (Uhlmann et al. in Prod Eng Res Dev 11(2):83–86, 2004; Surf Coat Technol 131, 395–399, 2000), the coating morphology, and the properties of cracks. Commonly, the microscopic properties are analyzed in a transmission electron microscope (TEM). This paper presents a novel investigation method that includes a faster specimen preparation and offers new analysis possibilities. It applies a well-established device combining a scanning focused ion beam (FIB) column for the preparation of specimens with a scanning electron microscope (SEM) for their analysis. The aim of the paper is to verify preparation parameters for which the microscopic properties of the diamond coating are preserved during the FIB preparation, and to show that the SEM analysis provides the same results for microstructure and element distribution as the TEM analysis. Furthermore, the feasibility of new analysis methods is studied. The FIB/SEM method directly traces and images defects in the nanometer range such as cracks, crack propagation directions, delaminations, and layer inhomogeneities. The bulk FIB-prepared specimens are suitable for a precise standards-based element quantification and distribution analysis using energy-dispersive (EDX) or wavelength-dispersive (WDX) X-ray spectroscopy. Moreover, the WDX analysis distinguishes between graphite-like carbon and diamond.
