
    Development of models for predicting Torsade de Pointes cardiac arrhythmias using perceptron neural networks

    Blockade of certain ion channels, in particular the hERG cardiac potassium channel, delays cardiac repolarization and can induce arrhythmia, in some cases leading to a potentially life-threatening arrhythmia known as Torsade de Pointes (TdP). Recognizing drugs with TdP risk is therefore essential. Candidate drugs determined not to cause cardiac ion channel blockade are more likely to pass successfully through preclinical work and phase II and III clinical trials, and less likely to be withdrawn later from the market due to cardiotoxic effects. The objective of the present study is to develop an SAR model that can be used as an early screen for torsadogenic (TdP-causing) potential in drug candidates. The method uses descriptors comprising atomic NMR chemical shifts and the corresponding interatomic distances, which are combined into a 3D abstract space matrix. The method, called 3D-SDAR (3-dimensional spectral data-activity relationship), can be interrogated to identify molecular features responsible for the activity, which can in turn yield simplified hERG toxicophores. A dataset of 55 hERG potassium channel inhibitors collected from Kramer et al., consisting of 32 drugs with TdP risk and 23 with no TdP risk, was used for training the 3D-SDAR model. An ANN model with a multilayer perceptron was used to define collinearities among the independent 3D-SDAR features. A composite model from 200 random iterations, with 25% of the molecules in each case, yielded the following figures of merit: training, 99.2%; internal test sets, 66.7%; external (blind validation) test set, 68.4%. In the external test set, 70.3% of TdP-positive drugs were correctly predicted. Toxicophores were also generated from the TdP drugs. 3D-SDAR was thus successfully used to build a model that distinguishes torsadogenic from non-torsadogenic drugs. Comment: Accepted for publication in BMC Bioinformatics (Springer), July 201
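
    A minimal sketch, assuming the 200 random iterations each hold out 25% of the molecules as an internal test set, of how such a perceptron-based classifier could be trained on precomputed descriptor vectors with scikit-learn. The descriptor matrix, labels, and hyperparameters below are illustrative placeholders, not the authors' 3D-SDAR implementation.

```python
# Hedged sketch: MLP classifier on placeholder descriptor vectors, accuracy
# averaged over 200 random 75/25 splits (only loosely mirroring the protocol
# described above; none of this is the authors' code or data).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import ShuffleSplit
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(55, 128))      # placeholder 3D-SDAR-like descriptors
y = rng.integers(0, 2, size=55)     # placeholder TdP risk labels (1 = TdP risk)

splitter = ShuffleSplit(n_splits=200, test_size=0.25, random_state=0)
scores = []
for train_idx, test_idx in splitter.split(X):
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000, random_state=0),
    )
    model.fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))

print(f"mean internal-test accuracy over 200 splits: {np.mean(scores):.3f}")
```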

    Microbiological contamination models for use in risk assessment during pharmaceutical production

    This paper describes the fundamental mechanisms of microbial contamination during the manufacture of pharmaceutical products. Models are derived that describe airborne and surface-contact contamination. These models can be used to develop and improve methods of microbial risk assessment. The use of the FMEA (FMECA) method of risk assessment is discussed and, provided the correct risk factors are used, endorsed.
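
    The FMEA/FMECA approach mentioned above is usually operationalised by scoring each contamination route for severity, occurrence and detectability and ranking routes by the resulting risk priority number (RPN = S x O x D). The sketch below illustrates that calculation; the routes and scores are invented for illustration and are not taken from the paper.

```python
# Illustrative FMEA-style ranking of hypothetical contamination routes by
# risk priority number (RPN = severity * occurrence * detectability).
from dataclasses import dataclass

@dataclass
class FailureMode:
    route: str
    severity: int       # 1 (negligible) .. 10 (critical)
    occurrence: int     # 1 (rare) .. 10 (frequent)
    detectability: int  # 1 (always detected) .. 10 (virtually undetectable)

    @property
    def rpn(self) -> int:
        return self.severity * self.occurrence * self.detectability

modes = [
    FailureMode("airborne deposition onto open product", 8, 4, 6),
    FailureMode("surface contact via operator gloves", 7, 5, 4),
    FailureMode("transfer from inadequately sanitised equipment", 9, 2, 5),
]

# Highest-RPN routes are the first candidates for additional controls.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{m.route:<48s} RPN = {m.rpn}")
```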

    Predicting Skin Permeability by means of Computational Approaches: Reliability and Caveats in Pharmaceutical Studies

    The skin is the main barrier between the internal body environment and the external one. The characteristics and properties of this barrier can modify and affect drug delivery and chemical toxicity parameters. It is therefore not surprising that the permeability of many different compounds has been measured through several in vitro and in vivo techniques. Moreover, many different in silico approaches have been used to identify the correlation between the structure of permeants and their permeability, to reproduce skin behavior, and to predict the ability of specific chemicals to permeate this barrier. A significant number of issues, such as interlaboratory variability, experimental conditions, dataset building rationales, and skin site of origin and hydration, still prevent us from obtaining a definitive predictive skin permeability model. This review presents the main advances and principal computational approaches used to predict this property, highlights the main issues that have arisen, and addresses the challenges for future research.
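
    As a concrete, if simplistic, example of the kind of in silico model discussed in the review, the sketch below evaluates the widely cited Potts-Guy relationship, log kp (cm/s) = 0.71 logP - 0.0061 MW - 6.3. The review does not single out this equation, and the example compounds and their approximate logP and molecular-weight values are included only for illustration.

```python
# Hedged sketch: a simple QSPR estimate of skin permeability using the
# Potts-Guy form  log kp (cm/s) = 0.71*logP - 0.0061*MW - 6.3.
# Compound values are approximate and for illustration only.
def log_kp_potts_guy(log_p: float, mol_weight: float) -> float:
    """Predicted log10 of the skin permeability coefficient kp (cm/s)."""
    return 0.71 * log_p - 0.0061 * mol_weight - 6.3

examples = {
    "caffeine (approx.)":  (-0.07, 194.2),
    "ibuprofen (approx.)": (3.97, 206.3),
}
for name, (log_p, mw) in examples.items():
    print(f"{name:<22s} predicted log kp = {log_kp_potts_guy(log_p, mw):.2f}")
```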

    Scanning electron microscopy image representativeness: morphological data on nanoparticles.

    A sample of a nanomaterial contains a distribution of nanoparticles of various shapes and/or sizes. A scanning electron microscopy image of such a sample often captures only a fragment of the morphological variety present in the sample. In order to quantitatively analyse the sample using scanning electron microscope digital images, and in particular to derive numerical representations of the sample morphology, image content has to be assessed. In this work, we present a framework for extracting the morphological information contained in scanning electron microscopy images using computer vision algorithms and converting it into numerical particle descriptors. We explore the concept of image representativeness and provide a set of protocols for selecting optimal scanning electron microscopy images as well as determining the smallest representative image set for each of the morphological features. We demonstrate the practical aspects of our methodology by investigating tricalcium phosphate, Ca3(PO4)2, and calcium hydroxyphosphate, Ca5(PO4)3(OH), both naturally occurring minerals with a wide range of biomedical applications.
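
    The computer-vision step described above can be approximated, at its simplest, by segmenting particles and turning each connected region into shape descriptors. The sketch below does this with scikit-image on a synthetic image that stands in for a real SEM micrograph; it is a minimal illustration, not the authors' framework.

```python
# Hedged sketch: segment bright particles in an SEM-like greyscale image and
# extract basic per-particle shape descriptors with scikit-image.
import numpy as np
from skimage import filters, measure

# Synthetic "micrograph": dark background with two bright, roughly round particles.
image = np.zeros((128, 128), dtype=float)
yy, xx = np.mgrid[0:128, 0:128]
image[(yy - 40) ** 2 + (xx - 40) ** 2 < 15 ** 2] = 1.0
image[(yy - 90) ** 2 + (xx - 85) ** 2 < 22 ** 2] = 0.8
image += 0.05 * np.random.default_rng(0).normal(size=image.shape)

threshold = filters.threshold_otsu(image)   # global intensity threshold
labels = measure.label(image > threshold)   # connected-component labelling
for region in measure.regionprops(labels):
    print(f"particle {region.label}: area = {region.area} px, "
          f"eccentricity = {region.eccentricity:.2f}")
```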

    Modeling reactivity to biological macromolecules with a deep multitask network

    Most small-molecule drug candidates fail before entering the market, frequently because of unexpected toxicity. Often, toxicity is detected only late in drug development, because many types of toxicities, especially idiosyncratic adverse drug reactions (IADRs), are particularly hard to predict and detect. Moreover, drug-induced liver injury (DILI) is the most frequent reason drugs are withdrawn from the market and causes 50% of acute liver failure cases in the United States. A common mechanism often underlies many types of drug toxicities, including both DILI and IADRs: drugs are bioactivated by drug-metabolizing enzymes into reactive metabolites, which then conjugate to sites in proteins or DNA to form adducts. DNA adducts are often mutagenic and may alter the reading and copying of genes and their regulatory elements, causing gene dysregulation and even triggering cancer. Similarly, protein adducts can disrupt normal biological functions and induce harmful immune responses. Unfortunately, reactive metabolites are not reliably detected by experiments, and it is expensive to test drug candidates for their potential to form DNA or protein adducts during the early stages of drug development. In contrast, computational methods have the potential to quickly screen for covalent binding potential, thereby flagging problematic molecules and reducing the total number of necessary experiments. Here, we train a deep convolutional neural network, the XenoSite reactivity model, on literature data to accurately predict both the sites and the probability of reactivity of molecules with glutathione, cyanide, protein, and DNA. On the site level, cross-validated predictions had area under the curve (AUC) performances of 89.8% for DNA and 94.4% for protein. Furthermore, the model separated molecules electrophilically reactive with DNA and protein from nonreactive molecules with cross-validated AUC performances of 78.7% and 79.8%, respectively. On both the site and molecule level, the model significantly outperformed reactivity indices derived from quantum simulations reported in the literature. Moreover, we developed and applied a selectivity score to assess preferential reactions with the macromolecules as opposed to the common screening traps. For the entire data set of 2803 molecules, this approach yielded totals of 257 (9.2%) and 227 (8.1%) molecules predicted to be reactive only with DNA and protein, respectively, and hence molecules that would be missed by standard reactivity screening experiments. Site-of-reactivity data is an underutilized resource that can be used not only to predict whether molecules are reactive, but also to show where they might be modified to reduce toxicity while retaining efficacy. The XenoSite reactivity model is available at http://swami.wustl.edu/xenosite/p/reactivity.
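
    The abstract does not describe the network architecture, but a shared-representation multitask model of the general kind it describes can be sketched as below: a common trunk over per-atom descriptors with one output head per reactivity target (glutathione, cyanide, protein, DNA). The descriptors, labels, layer sizes and training loop are placeholders, not the XenoSite model.

```python
# Hedged PyTorch sketch of a shared-trunk multitask network for per-atom
# reactivity prediction; all data here are random placeholders.
import torch
from torch import nn

TASKS = ["glutathione", "cyanide", "protein", "DNA"]

class MultitaskReactivityNet(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        # Shared layers learn a common representation of each atom.
        self.trunk = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # One linear head per reactivity task (logits; sigmoid applied in the loss).
        self.heads = nn.ModuleDict({t: nn.Linear(hidden, 1) for t in TASKS})

    def forward(self, x: torch.Tensor) -> dict:
        shared = self.trunk(x)
        return {t: head(shared).squeeze(-1) for t, head in self.heads.items()}

# Placeholder data: 512 "atoms", 32 descriptors each, random binary labels per task.
torch.manual_seed(0)
X = torch.randn(512, 32)
y = {t: torch.randint(0, 2, (512,)).float() for t in TASKS}

model = MultitaskReactivityNet(n_features=32)
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):
    optimiser.zero_grad()
    logits = model(X)
    loss = sum(loss_fn(logits[t], y[t]) for t in TASKS)  # summed per-task losses
    loss.backward()
    optimiser.step()
    print(f"epoch {epoch}: summed task loss = {loss.item():.3f}")
```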

    Tastes and odours in potable water: perception versus reality

    Tastes and odours are amongst the few water quality standards immediately apparent to a consumer and, as a result, account for most consumer complaints about water quality. Although taste and odour problems can arise from a great many sources, from an operational point of view they are either "predictable" or "unpredictable". The former, which include problems related to actinomycete and algal growth, tend to occur in certain types of water under certain combinations of conditions, whereas the latter, typically chemical spills, can occur anywhere. Long-term control is one option for predictable problems, although biomanipulation on a large scale has had little success. Detection and avoidance is a more practicable option for both predictable and unpredictable problems, particularly if the distribution network can be serviced from other sources. Where these are not feasible, water treatment, typically using activated carbon, is possible. In general there is a reasonable understanding of which compounds cause taste and odour problems and how to treat them. An efficient taste and odour control programme therefore relies ultimately on good management of existing resources. However, a number of problems lie outside the remit of water supply companies and will require more fundamental regulation of activities in the catchment.

    Exploring the potential of 3D Zernike descriptors and SVM for protein–protein interface prediction

    Background: The correct determination of protein–protein interaction interfaces is important for understanding disease mechanisms and for rational drug design. To date, several computational methods for the prediction of protein interfaces have been developed, but the interface prediction problem is still not fully understood. Experimental evidence suggests that the location of binding sites is imprinted in the protein structure, but there are major differences among the interfaces of the various protein types: the characterising properties can vary considerably depending on the interaction type and function. The selection of an optimal set of features characterising the protein interface, and the development of an effective method to represent and capture the complex protein recognition patterns, are of paramount importance for this task.
    Results: In this work we investigate the potential of a novel local surface descriptor based on 3D Zernike moments for the interface prediction task. Descriptors invariant to roto-translations are extracted from circular patches of the protein surface enriched with physico-chemical properties from the HQI8 amino acid index set, and are used as samples for a binary classification problem. Support Vector Machines are used as a classifier to distinguish interface local surface patches from non-interface ones. The proposed method was validated on 16 classes of proteins extracted from the Protein–Protein Docking Benchmark 5.0 and compared to other state-of-the-art protein interface predictors (SPPIDER, PrISE and NPS-HomPPI).
    Conclusions: The 3D Zernike descriptors are able to capture the similarity among patterns of physico-chemical and biochemical properties mapped on the protein surface arising from the various spatial arrangements of the underlying residues, and their usage can easily be extended to other sets of amino acid properties. The results suggest that the choice of a proper set of features characterising the protein interface is crucial for the interface prediction task, and that optimality strongly depends on the class of proteins whose interface we want to characterise. We postulate that different protein classes should be treated separately and that it is necessary to identify an optimal set of features for each protein class.
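
    The classification stage described in the Results can be sketched, under the assumption of precomputed per-patch feature vectors, as an RBF-kernel SVM distinguishing interface from non-interface patches. The descriptor vectors and labels below are random placeholders standing in for the 3D Zernike / HQI8 features; this is not the authors' pipeline.

```python
# Hedged sketch: SVM classification of surface patches from placeholder
# descriptor vectors, scored by cross-validated ROC AUC.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 121))     # placeholder per-patch descriptor vectors
y = rng.integers(0, 2, size=400)    # 1 = interface patch, 0 = non-interface

clf = make_pipeline(
    StandardScaler(),
    SVC(kernel="rbf", C=1.0, gamma="scale", class_weight="balanced"),
)
scores = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated ROC AUC: {scores.mean():.3f} +/- {scores.std():.3f}")
```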