6,376 research outputs found

    Structural Simplification of Bedaquiline: the Discovery of 3-(4-(N,N-dimethylaminomethyl)phenyl)quinoline Derived Antitubercular Lead Compounds

    Bedaquiline (BDQ) is a novel and highly potent last-line antituberculosis drug that was approved by the US FDA in 2013. Owing to its stereo-structural complexity, its chemical synthesis and optimization are difficult and expensive. This study describes the structural simplification of bedaquiline while preserving its antitubercular activity. The compound's structure was split into fragments and reassembled in various combinations, with the two chiral carbon atoms replaced by an achiral linkage. Four series of analogues were designed; these candidates retained potent antitubercular activity at sub-microgram-per-mL concentrations against both drug-sensitive and multidrug-resistant (MDR) Mycobacterium tuberculosis strains. Six of the top nine MIC-ranked candidates inhibited mycobacterial ATP synthesis with IC50 values between 20 and 40 μM, one had an IC50 above 66 μM, and two showed no inhibition despite their antitubercular activity. These results provide a basis for the development of chemically less complex, lower-cost bedaquiline derivatives and describe the identification of two derivatives whose antitubercular activity acts through targets unrelated to ATP synthase.

    Star Formation in Violent and Normal Evolutionary Phases

    Mergers of massive gas-rich galaxies trigger violent starbursts that, over timescales of >100 Myr and regions >10 kpc, form massive and compact star clusters comparable in mass and radius to Galactic globular clusters. The star formation efficiency in these bursts is 1-2 orders of magnitude higher than in undisturbed spirals, irregulars, or even BCDs. We ask whether star formation in these extreme regimes is just a scaled-up version of the normal star formation mode, or whether the formation of globular clusters reveals fundamentally different conditions.
    Comment: 4 pages. To appear in The Evolution of Galaxies. II. Basic Building Blocks, eds. M. Sauvage, G. Stasinska, L. Vigroux, D. Schaerer, S. Madden

    Development of a technology adoption and usage prediction tool for assistive technology for people with dementia

    This article is available open access through the publisher's website at the link below. Copyright © The Authors 2013.
    In the current work, data gleaned from an assistive technology (a reminding technology) that had been evaluated with people with dementia over a period of several years were retrospectively studied to extract the factors that contributed to successful adoption. The aim was to develop a prediction model capable of prospectively assessing whether the assistive technology would be suitable for a person with dementia (and their carer), based on user characteristics, needs, and perceptions. Such a prediction tool can empower a formal carer to assess, through a small number of questions, whether the technology will be adopted and used.
    EPSRC
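An adoption-prediction model of the kind described above could, for illustration, take the form of a simple logistic classifier over carer-reported characteristics. The sketch below is a minimal, hypothetical example: the feature names, weights, and data are invented and are not taken from the study itself.

```python
import numpy as np

# Hypothetical illustration: simulate carer-reported features and fit a
# logistic-regression adoption classifier. Nothing here comes from the study.
rng = np.random.default_rng(0)
n = 400
# Invented feature columns: e.g. cognitive score, carer support, tech familiarity.
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, -0.5, 2.0])          # invented "ground-truth" weights
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ true_w)))).astype(float)

# Fit logistic regression by gradient descent on the mean log-loss.
w = np.zeros(3)
for _ in range(3000):
    p = 1 / (1 + np.exp(-(X @ w)))           # predicted adoption probability
    w -= 0.5 * X.T @ (p - y) / n             # gradient step

accuracy = np.mean(((1 / (1 + np.exp(-(X @ w)))) > 0.5) == (y > 0.5))
```

In practice the model would be trained on the retrospectively collected adoption outcomes, and the fitted weights would indicate which user characteristics drive successful adoption.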

    Accurate Image Analysis of the Retina Using Hessian Matrix and Binarisation of Thresholded Entropy with Application of Texture Mapping

    In this paper, we demonstrate a comprehensive method for segmenting the retinal vasculature in camera images of the fundus. This is of interest for diagnosing eye diseases that affect the blood vessels in the eye. In a departure from other state-of-the-art methods, vessels are first pre-grouped together with graph partitioning, using a spectral clustering technique based on morphological features. Local curvature is estimated over the whole image using eigenvalues of the Hessian matrix in order to enhance the vessels, which appear as ridges in images of the retina. The result is combined with a binarized image, obtained using a threshold that maximizes entropy, to extract the retinal vessels from the background. Speckle-type noise is reduced by applying a connectivity constraint to the extracted curvature-enhanced image. This constraint is varied over the image according to each region's predominant blood vessel size. The resultant image exhibits the central light reflex of retinal arteries and veins, which prevents the segmentation of whole vessels. To address this, the earlier entropy-based binarization technique is repeated on the original image, but crucially, with a different threshold to incorporate the central-reflex vessels. The final segmentation is achieved by combining the segmented vessels with and without central light reflex. We evaluate our approach on DRIVE and REVIEW, two publicly available collections of retinal images for research purposes. The obtained results are compared with state-of-the-art methods in the literature using metrics such as sensitivity (true positive rate), selectivity (false positive rate) and accuracy rates for the DRIVE images and measured vessel widths for the REVIEW images. Our approach outperforms the methods in the literature.
    Xiaoxia Yin, Brian W-H Ng, Jing He, Yanchun Zhang, Derek Abbott
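Two core steps of the pipeline above, Hessian-eigenvalue vessel enhancement and maximum-entropy binarisation, can be sketched as follows. This is a minimal illustration in the spirit of the method, not the authors' implementation; the scale and response parameters (`sigma`, `beta`, `c`) are assumed values in the style of Frangi-type vesselness filters.

```python
import numpy as np
from scipy import ndimage

def hessian_vesselness(img, sigma=2.0, beta=0.5, c=0.5):
    """Enhance ridge-like (vessel) structures from Hessian eigenvalues."""
    # Second-order Gaussian derivatives approximate the Hessian at scale sigma.
    Hxx = ndimage.gaussian_filter(img, sigma, order=(0, 2))
    Hyy = ndimage.gaussian_filter(img, sigma, order=(2, 0))
    Hxy = ndimage.gaussian_filter(img, sigma, order=(1, 1))
    # Closed-form eigenvalues of the symmetric 2x2 Hessian.
    tmp = np.sqrt(((Hxx - Hyy) / 2.0) ** 2 + Hxy ** 2)
    mu = (Hxx + Hyy) / 2.0
    l1, l2 = mu + tmp, mu - tmp
    swap = np.abs(l1) > np.abs(l2)
    l1, l2 = np.where(swap, l2, l1), np.where(swap, l1, l2)  # |l1| <= |l2|
    rb = np.abs(l1) / (np.abs(l2) + 1e-10)   # deviation from a blob
    s = np.sqrt(l1 ** 2 + l2 ** 2)           # local structure strength
    v = np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))
    v[l2 > 0] = 0.0  # bright ridges on a dark background have l2 < 0
    return v

def max_entropy_threshold(img, bins=256):
    """Kapur-style threshold maximizing the summed fore/background entropy."""
    hist, edges = np.histogram(img, bins=bins)
    p = hist.astype(float) / hist.sum()
    P = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, bins - 1):
        p0, p1 = P[t], 1.0 - P[t]
        if p0 < 1e-10 or p1 < 1e-10:
            continue
        q0, q1 = p[: t + 1] / p0, p[t + 1:] / p1
        h = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0])) \
            - np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))
        if h > best_h:
            best_h, best_t = h, t
    return edges[best_t + 1]
```

The paper combines the thresholded curvature image with a second, differently thresholded binarisation to recover vessels showing the central light reflex.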

    Saliency Benchmarking Made Easy: Separating Models, Maps and Metrics

    Dozens of new models for fixation prediction are published every year and compared on open benchmarks such as MIT300 and LSUN. However, progress in the field can be difficult to judge because models are compared using a variety of inconsistent metrics. Here we show that no single saliency map can perform well under all metrics. Instead, we propose a principled approach to the benchmarking problem by separating the notions of saliency models, maps, and metrics. Inspired by Bayesian decision theory, we define a saliency model to be a probabilistic model of the fixation density, and a saliency map to be a metric-specific prediction derived from the model density that maximizes the expected performance on that metric. We derive these optimal saliency maps for the most commonly used saliency metrics (AUC, sAUC, NSS, CC, SIM, KL-Div) and show that they can be computed analytically or approximated with high precision. This leads to consistent rankings across all metrics and avoids the penalty of using one saliency map for all of them: each model competes on each metric with a map optimized for that metric, so "good" models perform well in all metrics.
    Comment: published at ECCV 2018
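As a small illustration of one of the metrics involved, NSS scores a saliency map by its mean z-scored value at measured fixation locations: a map aligned with where observers actually fixate scores high, while a misplaced one scores near zero or below. The sketch below is a toy example with invented data, not the benchmark's implementation.

```python
import numpy as np

def nss(saliency_map, fixations):
    # Normalized Scanpath Saliency: mean of the z-scored map at fixated pixels.
    z = (saliency_map - saliency_map.mean()) / saliency_map.std()
    return float(np.mean([z[r, c] for r, c in fixations]))

# Toy data: a fixation density peaked where observers look, and a misplaced one.
yy, xx = np.mgrid[0:64, 0:64]
density = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / (2 * 6.0 ** 2))
off_target = np.exp(-((yy - 10) ** 2 + (xx - 10) ** 2) / (2 * 6.0 ** 2))
fixations = [(32, 32), (30, 34), (33, 31)]  # invented fixation coordinates
```

The paper's point is that the map that maximizes expected NSS differs from the map that maximizes, say, expected SIM or AUC, even when both are derived from the same underlying density.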

    Deep Learning for Mobile Mental Health: Challenges and recent advances

    Mental health plays a key role in everyone's day-to-day life, shaping our thoughts, behaviours, and emotions. Over the past years, given their ubiquity and affordability, smartphones and wearable devices have been adopted rapidly across all aspects of mental health research and care, from screening and diagnosis to treatment and monitoring, and have driven significant progress in remote mental health interventions. While many challenges remain in this emerging cross-disciplinary field, such as data scarcity, lack of personalisation, and privacy concerns, it is of primary importance that innovative signal processing and deep learning techniques are exploited. In particular, recent advances in deep learning can provide the key enabling technology for the next generation of user-centric mobile mental health applications. In this article, we first outline the basic principles of mobile device-based mental health analysis, review the main system components, and highlight the conventional technologies involved. Next, we describe several major challenges and the deep learning technologies that have the potential to address each of them. Finally, we discuss remaining problems that need to be addressed through research collaboration across multiple disciplines.
    This paper has been partially funded by the Bavarian Ministry of Science and Arts as part of the Bavarian Research Association ForDigitHealth, the National Natural Science Foundation of China (Grant Nos. 62071330 and 61702370), and the Key Program of the National Natural Science Foundation of China (Grant No. 61831022).

    Imaging and Dynamics of Light Atoms and Molecules on Graphene

    Observing the individual building blocks of matter is one of the primary goals of microscopy. The invention of the scanning tunneling microscope [1] revolutionized experimental surface science in that atomic-scale features on a solid-state surface could finally be readily imaged. However, scanning tunneling microscopy has limited applicability due to restrictions in, for example, sample conductivity, cleanliness, and data acquisition rate. An older technique, transmission electron microscopy (TEM) [2, 3], has benefited tremendously in recent years from subtle instrumentation advances, and individual heavy (high atomic number) atoms can now be detected by TEM [4-7] even when embedded within a semiconductor material [8, 9]. However, detecting an individual low-atomic-number atom, for example carbon or even hydrogen, remains extremely challenging, if not impossible, with conventional TEM due to the very low contrast of light elements [2, 3, 10-12]. Here we demonstrate a means to observe, by conventional transmission electron microscopy, even the smallest atoms and molecules: on a clean single-layer graphene membrane, adsorbates such as atomic hydrogen and carbon can be seen as if they were suspended in free space. We directly image such individual adatoms, along with carbon chains and vacancies, and investigate their dynamics in real time. These techniques open a way to reveal the dynamics of more complex chemical reactions or to identify the atomic-scale structure of unknown adsorbates. In addition, the study of atomic-scale defects in graphene may provide insights for nanoelectronic applications of this interesting material.
    Comment: 9 pages manuscript and figures, 9 pages supplementary information

    Electric-field-induced alignment of electrically neutral disk-like particles: modelling and calculation

    This work reveals a torque exerted by an electric field on electrically neutral flakes suspended in a matrix of higher electrical conductivity. The torque tends to rotate the particles toward an orientation with their long axes parallel to the electric current flow. Such alignment allows the anisotropic properties of tiny particles to combine and generate desirable macroscale anisotropic properties. The torque was obtained from a thermodynamic calculation of the electric-current free energy at various microstructure configurations; it remains significant even when the electrical potential gradient is as low as 100 V/m. The changes in electrical, electroplastic, and thermal properties during particle alignment are also discussed.

    From Nonspecific DNA–Protein Encounter Complexes to the Prediction of DNA–Protein Interactions

    ©2009 Gao, Skolnick. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. doi:10.1371/journal.pcbi.1000341
    DNA–protein interactions are involved in many essential biological activities. Because there is no simple mapping code between DNA base pairs and protein amino acids, predicting DNA–protein interactions is a challenging problem. Here, we present a novel computational approach for predicting DNA-binding protein residues and DNA–protein interaction modes without knowledge of the protein's specific DNA target sequence. Given the structure of a DNA-binding protein, the method first generates an ensemble of complex structures by rigid-body docking with a nonspecific canonical B-DNA. Representative models are then selected through clustering and ranking by their DNA–protein interfacial energy. Analysis of these encounter-complex models suggests that the recognition sites for specific DNA binding are usually favorable interaction sites for the nonspecific DNA probe, and that nonspecific DNA–protein interaction modes exhibit some similarity to specific DNA–protein binding modes. Although the method requires as input the knowledge that the protein binds DNA, in benchmark tests it identifies DNA-binding sites better than three previously established methods based on sophisticated machine-learning techniques. We further apply the method to protein structures predicted through modeling and show that it performs satisfactorily on models whose root-mean-square Cα deviation from the native structure is up to 5 Å. This study provides valuable structural insights into how a specific DNA-binding protein interacts with a nonspecific DNA sequence. The similarity between the specific DNA–protein interaction mode and nonspecific interaction modes may reflect an important sampling step in a DNA-binding protein's search for its specific DNA targets.
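The clustering-and-ranking step described above, grouping docked poses and letting each cluster be represented by its lowest-interfacial-energy member, might be sketched as follows. This is a hypothetical simplification: poses are reduced to 2-D points, Euclidean distance stands in for an interface similarity measure, and the energies are invented.

```python
import numpy as np

def cluster_and_rank(poses, energies, cutoff=2.0):
    """Greedy clustering of docked poses (rows of `poses`) by distance;
    each cluster is seeded and represented by its lowest-energy member.
    Returns representative indices ordered from best to worst energy."""
    order = np.argsort(energies)  # visit poses from lowest to highest energy
    reps = []
    for i in order:
        # A pose far from every existing representative starts a new cluster;
        # otherwise it joins a cluster whose (better-energy) seed represents it.
        if all(np.linalg.norm(poses[i] - poses[r]) > cutoff for r in reps):
            reps.append(int(i))
    return reps
```

In the actual method the representatives would then be ranked by interfacial energy to propose likely DNA-binding modes and residues.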

    QCD corrections to J/ψ plus Z^0-boson production at the LHC

    J/ψ+Z^0 associated production at the LHC is an important process for investigating the color-octet mechanism of non-relativistic QCD (NRQCD) in describing processes involving heavy quarkonium. We calculate the next-to-leading-order (NLO) QCD corrections to J/ψ+Z^0 associated production at the LHC within the NRQCD factorization formalism and provide theoretical predictions for the distribution of the J/ψ transverse momentum. Our results show that the leading-order differential cross section is significantly enhanced by the NLO QCD corrections. We conclude that the LHC has the potential to verify the color-octet mechanism by measuring J/ψ+Z^0 production events.
    Comment: 14 pages RevTeX, 5 eps figures, to appear in JHEP. Fig. 5 and the corresponding analysis are corrected