
    From Sensor Readings to Predictions: On the Process of Developing Practical Soft Sensors.

    Automatic data acquisition systems provide large amounts of streaming data generated by physical sensors. This data forms the input to computational models (soft sensors) routinely used for monitoring and control of industrial processes, traffic patterns, the environment, natural hazards, and more. The majority of these models assume that the data arrives cleaned and pre-processed, ready to be fed directly into a predictive model. In practice, to ensure appropriate data quality, most of the modelling effort is spent preparing raw sensor readings for use as model inputs. This study analyzes the process of data preparation for predictive models with streaming sensor data. We present data preparation as a four-step process, identify the key challenges in each step, and provide recommendations for handling them. The discussion focuses on approaches that are less commonly used but that, in our experience, contribute particularly well to solving practical soft sensor tasks. Our arguments are illustrated with a case study from the chemical production industry.
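    As one concrete illustration of such a four-step pipeline, the following Python/pandas sketch prepares raw streaming readings through alignment, cleaning, resampling, and feature generation. The step boundaries, robust 3-sigma clipping, 1-minute rate, and 10-minute windows are illustrative assumptions rather than the paper's prescriptions, and the raw table is assumed to carry datetime 'timestamp', 'sensor_id', and 'value' columns.

        import pandas as pd

        def prepare(readings: pd.DataFrame) -> pd.DataFrame:
            """Turn raw rows (timestamp, sensor_id, value) into model inputs."""
            # Step 1 -- align: one column per physical sensor, indexed by time.
            wide = readings.pivot_table(index="timestamp", columns="sensor_id",
                                        values="value", aggfunc="mean")
            # Step 2 -- clean: clip gross outliers to robust 3-sigma bounds
            # (median +/- 3 * 1.4826 * MAD) per sensor.
            med = wide.median()
            mad = (wide - med).abs().median()
            wide = wide.clip(lower=med - 3 * 1.4826 * mad,
                             upper=med + 3 * 1.4826 * mad, axis=1)
            # Step 3 -- resample to a common rate and bridge short gaps only.
            wide = wide.resample("1min").mean().interpolate(limit=5)
            # Step 4 -- feature generation: rolling statistics as model inputs.
            feats = pd.concat({"mean": wide.rolling("10min").mean(),
                               "std": wide.rolling("10min").std()}, axis=1)
            return feats.dropna()

    Limiting interpolation to short gaps keeps long sensor outages visible downstream rather than papering over them.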

    Low potency toxins reveal dense interaction networks in metabolism

    Background: The chemicals of metabolism are constructed from a small set of atoms and bonds. This may be because chemical structures outside the chemical space in which life operates are incompatible with biochemistry, or because mechanisms to make or utilize such excluded structures have not evolved. In this paper I address the extent to which biochemistry is restricted to a small fraction of the chemical space of possible chemicals, a restricted subset that I call Biochemical Space. I explore evidence that this restriction is at least in part due to selection against specific structures, and suggest a mechanism by which this occurs. Results: Chemicals that contain structures outside Biochemical Space (UnBiological groups) are more likely to be toxic to a wide range of organisms, even though they have no specifically toxic groups and no obvious mechanism of toxicity. This correlation of UnBiological groups with toxicity is stronger for low potency (millimolar) toxins. I relate this to the observation that most chemicals interact with many biological structures at low (millimolar) concentrations. I hypothesise that life has to select its components not only to have a specific set of functions but also to avoid interactions with all the other components of life that might degrade their function. Conclusions: The chemistry of life has to form a dense, self-consistent network of chemical structures, and cannot easily be arbitrarily extended. The toxicity of arbitrary chemicals reflects the disruption to that network occasioned by trying to insert a chemical into it without also selecting all the other components to tolerate that chemical. This suggests new ways to test for the toxicity of chemicals, and implies that engineering organisms to make high concentrations of materials such as chemical precursors or fuels may require more substantial engineering than the synthetic pathways alone.
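    As a toy illustration of the core idea (not the paper's actual method), one can score a molecule's UnBiological character as the fraction of its circular fragments that never occur in a reference set of metabolites. The RDKit Morgan-fragment representation, the radius, and the three-compound reference list are all assumptions made for this sketch; a real analysis would draw on a metabolome database.

        from rdkit import Chem
        from rdkit.Chem import AllChem

        def fragment_ids(smiles: str, radius: int = 2) -> set:
            """Hashed circular (Morgan) fragment identifiers of a molecule."""
            mol = Chem.MolFromSmiles(smiles)  # assumes valid SMILES input
            return set(AllChem.GetMorganFingerprint(mol, radius).GetNonzeroElements())

        # Tiny stand-in reference set: acetate, glycine, glucose.
        metabolites = ["CC(=O)O", "C(C(=O)O)N", "OCC1OC(O)C(O)C(O)C1O"]
        biochemical_space = set().union(*(fragment_ids(s) for s in metabolites))

        def unbiological_fraction(smiles: str) -> float:
            """Share of a molecule's fragments absent from the reference set."""
            frags = fragment_ids(smiles)
            return len(frags - biochemical_space) / len(frags)

        # Compare a simple alcohol with a heavily halogenated compound.
        print(unbiological_fraction("CCO"))
        print(unbiological_fraction("FC(F)(F)Br"))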

    In silico toxicology protocols

    The present publication surveys several applications of in silico (i.e., computational) toxicology approaches across different industries and institutions. It highlights the need to develop standardized protocols when conducting toxicity-related predictions. This contribution articulates the information needed for protocols to support in silico predictions for major toxicological endpoints of concern (e.g., genetic toxicity, carcinogenicity, acute toxicity, reproductive toxicity, developmental toxicity) across several industries and regulatory bodies. Such novel in silico toxicology (IST) protocols, when fully developed and implemented, will ensure that in silico toxicological assessments are performed and evaluated in a consistent, reproducible, and well-documented manner across industries and regulatory bodies, supporting wider uptake and acceptance of the approaches. The development of IST protocols is an initiative of an international consortium and reflects the state of the art in in silico toxicology for hazard identification and characterization. A general outline for the development of such protocols is included; it is based on in silico predictions and/or available experimental data for a defined series of relevant toxicological effects or mechanisms. The publication presents a novel approach for determining the reliability of in silico predictions alongside experimental data. In addition, we discuss how to determine the level of confidence in the assessment based on the relevance and reliability of the information.
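    A minimal sketch of how such a protocol might weigh lines of evidence: each call for an endpoint carries a reliability grade, and an overall confidence level follows from the concordance and combined reliability of the calls. The numeric scale and the combination rule below are invented for illustration; the consortium's actual scheme is more elaborate.

        from dataclasses import dataclass

        RELIABILITY = {"high": 3, "moderate": 2, "low": 1}

        @dataclass
        class Evidence:
            endpoint: str     # e.g. "genetic toxicity"
            source: str       # "in silico" or "experimental"
            result: str       # "positive" or "negative"
            reliability: str  # key into RELIABILITY

        def confidence(evidence: list[Evidence]) -> str:
            """Overall confidence for one endpoint (illustrative rule)."""
            calls = {e.result for e in evidence}
            if len(calls) > 1:
                return "low"  # conflicting calls dominate any reliability score
            score = sum(RELIABILITY[e.reliability] for e in evidence)
            return "high" if score >= 4 else "moderate"

        assessment = [
            Evidence("genetic toxicity", "in silico", "negative", "moderate"),
            Evidence("genetic toxicity", "experimental", "negative", "high"),
        ]
        print(confidence(assessment))  # concordant, well-supported -> "high"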

    Mode of action-based classification and prediction of activity of uncouplers for the screening of chemical inventories

    A new approach for the classification of uncouplers of oxidative phosphorylation and photophosphorylation, also suitable for screening large chemical inventories, is introduced. Earlier fragment-based approaches for this mode of toxic action are limited to phenols, yet weak acids from extremely diverse chemical classes can act as uncouplers. The proposed approach overcomes this limitation by combining structural fragments with the global information of physico-chemical descriptors. In a top-down approach to reduce the number of candidate chemicals, substructure definitions for the detection of weak acids were applied first. Subsequently, conservative physico-chemical thresholds for the two properties most important for uncoupling activity were defined: an acid dissociation constant (pKa) between 3 and 9, and a sufficiently low energy barrier for the internal permeability of anions (17 kcal/mol). The latter was derived from a novel approach to calculating the distribution of compounds across membranes. The combination of structural and physico-chemical criteria allowed a good separation of active from inactive chemicals, with high sensitivity (95%) and somewhat lower specificity (above 75%). Applying this approach to several thousand high and low production volume chemicals retrieved only 10 compounds with a predicted excess toxicity above 10, a surprisingly small number. Nevertheless, uncoupling can be an important mode of action, as highlighted by several examples ranging from pesticide metabolites to persistent organic compounds.
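    A minimal sketch of this two-stage screen in Python with RDKit: a substructure filter for weak-acid fragments followed by the two physico-chemical thresholds quoted above (pKa between 3 and 9, anion permeation barrier at most 17 kcal/mol). The SMARTS patterns are illustrative stand-ins for the paper's broader fragment definitions, and the pKa and barrier values are assumed to be supplied by external predictors.

        from rdkit import Chem

        # Illustrative acidic substructures; the paper's definitions are broader.
        WEAK_ACID_SMARTS = [Chem.MolFromSmarts(s) for s in (
            "[OX2H][c]",         # phenolic OH
            "[CX3](=O)[OX2H]",   # carboxylic acid
            "[NX3H][S](=O)=O",   # acidic sulfonamide N-H
        )]

        def candidate_uncoupler(smiles: str, pka: float, barrier_kcal: float) -> bool:
            mol = Chem.MolFromSmiles(smiles)  # assumes valid SMILES input
            # Stage 1: structural filter -- must contain a weak-acid fragment.
            if not any(mol.HasSubstructMatch(p) for p in WEAK_ACID_SMARTS):
                return False
            # Stage 2: physico-chemical thresholds from the abstract.
            return 3.0 <= pka <= 9.0 and barrier_kcal <= 17.0

        # Pentachlorophenol, a classic uncoupler (descriptor values illustrative):
        print(candidate_uncoupler("Oc1c(Cl)c(Cl)c(Cl)c(Cl)c1Cl",
                                  pka=4.7, barrier_kcal=12.0))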