
    Otter-Knowledge: benchmarks of multimodal knowledge graph representation learning from different sources for drug discovery

    Recent research in representation learning uses large databases of proteins and molecules to learn drug and protein structure through unsupervised pre-training. These pre-trained representations have been shown to significantly improve the accuracy of downstream tasks, such as predicting the binding affinity between drugs and target proteins. In this study, we demonstrate that by incorporating knowledge graphs from diverse sources and modalities into the sequence or SMILES representation, we can further enrich the representation and achieve state-of-the-art results on established benchmark datasets. We provide preprocessed and integrated data obtained from 7 public sources, encompassing over 30M triples. We also release the pre-trained models based on these data, together with their reported performance on three widely used benchmark datasets for drug-target binding affinity prediction from the Therapeutic Data Commons (TDC) benchmarks, and we make the source code for training models on the benchmark datasets publicly available. Our objective in releasing these pre-trained models, together with clean data for model pre-training and benchmark results, is to encourage research in knowledge-enhanced representation learning.
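    The kind of enrichment described above can be sketched minimally as fusing two pre-trained views of a drug; all dimensions, the random projection, and the variable names below are illustrative assumptions, not the paper's actual architecture:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical pre-trained embeddings:
    # a sequence/SMILES embedding for a drug, and a knowledge-graph embedding
    # learned from triples such as (drug, targets, protein).
    smiles_emb = rng.standard_normal(128)  # e.g. from a SMILES language model
    kg_emb = rng.standard_normal(64)       # e.g. from a KG embedding model

    # One simple enrichment strategy: concatenate the two views and project
    # them into a shared space with a (here random, untrained) linear map.
    W = rng.standard_normal((96, 128 + 64)) / np.sqrt(128 + 64)
    enriched = W @ np.concatenate([smiles_emb, kg_emb])
    ```

    In practice the projection would be learned jointly with the downstream affinity-prediction objective rather than fixed at random.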

    A recommender for the management of chronic pain in patients undergoing spinal cord stimulation

    Spinal cord stimulation (SCS) is a therapeutic approach used for the management of chronic pain. It involves the delivery of electrical impulses to the spinal cord via an implanted device, which, when given suitable stimulus parameters, can mask or block pain signals. Selection of optimal stimulation parameters usually happens in the clinic under the care of a provider, whereas at-home SCS optimization is managed by the patient. In this paper, we propose a recommender system for the management of pain in chronic pain patients undergoing SCS. In particular, we use a contextual multi-armed bandit (CMAB) approach to develop a system that recommends SCS settings to patients with the aim of improving their condition. These recommendations, sent directly to patients through a digital health ecosystem and combined with a patient monitoring system, close the therapeutic loop around a chronic pain patient over their entire patient journey. We evaluated the system in a cohort of SCS-implanted ENVISION study subjects (Clinicaltrials.gov ID: NCT03240588) using a combination of quality of life metrics and Patient States (PS), a novel measure of holistic outcomes. SCS recommendations provided statistically significant improvement in clinical outcomes (pain and/or QoL) in 85% of all subjects (N=21). Among subjects in moderate PS (N=7) prior to receiving recommendations, 100% showed statistically significant improvements and 5/7 had improved PS dwell time. This analysis suggests SCS patients may benefit from SCS recommendations, resulting in additional clinical improvement on top of benefits already received from SCS therapy.
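    A contextual multi-armed bandit of the kind described can be sketched with LinUCB, a standard CMAB algorithm; the arm count, context features, and synthetic rewards below are illustrative assumptions, not the study's actual recommender:

    ```python
    import numpy as np

    class LinUCB:
        """Minimal LinUCB contextual bandit: one ridge-regression model per arm."""
        def __init__(self, n_arms, dim, alpha=1.0):
            self.alpha = alpha
            self.A = [np.eye(dim) for _ in range(n_arms)]    # per-arm Gram matrices
            self.b = [np.zeros(dim) for _ in range(n_arms)]  # reward-weighted contexts

        def select(self, x):
            # Upper-confidence score per arm: theta^T x + alpha * sqrt(x^T A^-1 x).
            scores = []
            for A, b in zip(self.A, self.b):
                A_inv = np.linalg.inv(A)
                theta = A_inv @ b
                scores.append(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))
            return int(np.argmax(scores))

        def update(self, arm, x, reward):
            self.A[arm] += np.outer(x, x)
            self.b[arm] += reward * x

    # Toy simulation: 3 candidate stimulation settings ("arms") and 4 context
    # features (e.g. recent pain score, activity level); rewards are synthetic.
    rng = np.random.default_rng(1)
    bandit = LinUCB(n_arms=3, dim=4)
    true_theta = [rng.standard_normal(4) for _ in range(3)]
    for _ in range(200):
        x = rng.standard_normal(4)
        arm = bandit.select(x)
        reward = true_theta[arm] @ x + 0.1 * rng.standard_normal()
        bandit.update(arm, x, reward)
    ```

    The confidence bonus makes the recommender explore under-tried settings before committing, which is the property that makes bandits attractive for at-home therapy optimization.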

    Development of efficient data assimilation methods for solute transport problems

    The solution of marine contaminant transport problems is a significant research topic in civil engineering. Typically, the problem is represented as a partial differential equation described by the non-stationary advection-diffusion operator. The underlying equation is approximated in space and time, and the state of the approximate numerical model is obtained as a solution of the corresponding linear algebraic system. However, for several reasons, numerical models do not necessarily replicate the investigated process exactly. In fact, their application and deployment introduce modelling errors and discrepancies, a well-known and challenging problem that cannot be ignored in practice. At the same time, the rapid development of measuring devices allows easier collection of data about the physical process. These observations are also prone to contamination by errors, whether generated by noise or other physical causes, and are usually quite sparse in time and space. Data assimilation techniques combine a numerical model with observations, striking a compromise between the two. The development of data assimilation methods that are efficient in terms of both estimation quality and computation speed is the main concern of this research. Traditional data assimilation algorithms such as the minimax or Kalman filters are often used to quantify the uncertainties represented by the model and observation errors. They construct an analysis state and propagate it in time, taking into account the model dynamics and the observed information. The numerical algorithms of these filters are computationally expensive, as they require multiplication and inversion of matrices whose size equals the number of degrees of freedom of the system. Moreover, traditional filters are not scalable with respect to the number of discretisation nodes. 
In this research, a combination of traditional filters with domain decomposition techniques is investigated to assess the reduction of computational costs. Applying decomposition to the assimilation problem allows the global problem to be reformulated as a set of local subproblems coupled by continuity or transmission conditions. To solve the decomposed assimilation problem, two new approaches are considered. The first discretises the transmission conditions directly and yields a system of differential-algebraic equations, which is solved using a modified version of the minimax filter. The second imposes the transmission conditions in the variational formulation of the local subproblems; the resulting set of local differential problems is solved by the Schwarz iterative method. This approach is further extended to Kalman and ensemble filters using their equivalence with the minimax filter. The efficiency of the proposed methods is examined through numerical experiments with different configurations, including simulations with a constant velocity field, a periodic velocity field, and a velocity field generated by TELEMAC 2D for a tidal basin. The quality of the estimates produced by the localised filters is assessed against both the traditional filters and true solutions. The computational efficiency of the localised filters is evaluated, compared to existing methods, and discussed. Finally, scalability properties of the proposed algorithms are presented.
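The predict/update cycle of the traditional Kalman filter mentioned above can be sketched as follows; the matrix inversion in the update step is the cost that grows with the number of degrees of freedom and motivates the localised filters. The toy dimensions and matrices are illustrative, not taken from the thesis:

```python
import numpy as np

def kalman_step(x, P, F, Q, H, R, y):
    """One predict/update cycle of the Kalman filter.

    x, P : prior state estimate and its covariance
    F, Q : model dynamics and model-error covariance
    H, R : observation operator and observation-error covariance
    y    : the new observation
    """
    # Predict: propagate the state and covariance through the model dynamics.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: blend the prediction with the observation via the Kalman gain.
    # The inversion of S (size = number of observations) and the dense
    # covariance products are what make the filter expensive at scale.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: 3-state system, identity dynamics, direct observation.
n = 3
x, P = np.zeros(n), np.eye(n)
F, Q = np.eye(n), 0.01 * np.eye(n)
H, R = np.eye(n), 0.1 * np.eye(n)
x, P = kalman_step(x, P, F, Q, H, R, y=np.ones(n))
```

Domain decomposition attacks exactly this bottleneck: each subdomain runs a filter over a much smaller state, with transmission conditions coupling the local estimates.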

    Practical perfusion quantification in multispectral endoscopic video: using the minutes after ICG administration to assess tissue pathology

    The wide availability of near-infrared light sources in interventional medical imaging stacks enables non-invasive quantification of perfusion using fluorescent dyes, typically Indocyanine Green (ICG). Due to their often leaky and chaotic vasculatures, cancerous tissues perfuse intravenously administered ICG differently. We investigate here how a few characteristic values derived from the fluorescence time series can be used in simple machine learning algorithms to distinguish benign lesions from cancers. These features capture the initial uptake of ICG in the colon, its peak fluorescence, and its early wash-out. Using simple, explainable algorithms, we demonstrate in clinical cases that sensitivity (specificity) rates of over 95% (95%) for cancer classification can be achieved.
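    The classification step can be sketched with a simple, explainable model on synthetic data; the feature names, effect directions, and logistic-regression choice below are assumptions for illustration, not the study's measured values or exact method:

    ```python
    import numpy as np

    # Hypothetical per-lesion features derived from the ICG fluorescence
    # time series (names are illustrative):
    #   uptake_slope   - initial rate of fluorescence increase
    #   peak_intensity - maximum fluorescence reached
    #   washout_slope  - rate of decay shortly after the peak
    rng = np.random.default_rng(2)
    n = 200
    cancer = rng.integers(0, 2, n)
    # Synthetic data assuming cancers show slower uptake, lower peak,
    # and slower wash-out than benign lesions.
    X = np.column_stack([
        rng.normal(1.0 - 0.3 * cancer, 0.2),   # uptake_slope
        rng.normal(1.0 - 0.4 * cancer, 0.2),   # peak_intensity
        rng.normal(-1.0 + 0.5 * cancer, 0.2),  # washout_slope
    ])

    # An explainable classifier: logistic regression fit by gradient descent,
    # so each learned weight directly shows how a feature shifts the decision.
    Xb = np.column_stack([np.ones(n), X])  # prepend an intercept column
    w = np.zeros(4)
    for _ in range(2000):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= 0.1 * Xb.T @ (p - cancer) / n
    pred = (1.0 / (1.0 + np.exp(-Xb @ w)) > 0.5).astype(int)
    accuracy = (pred == cancer).mean()
    ```

    With only three features and a linear decision rule, the model stays interpretable, which matters when the output informs intraoperative decisions.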