2,380 research outputs found

    Agent-based modeling: a systematic assessment of use cases and requirements for enhancing pharmaceutical research and development productivity.

    A crisis continues to brew within the pharmaceutical research and development (R&D) enterprise: productivity keeps declining as costs rise, despite ongoing, often dramatic scientific and technical advances. To reverse this trend, we offer suggestions for both the expansion and the broader adoption of modeling and simulation (M&S) methods. We suggest strategies and scenarios intended to enable new M&S use cases that directly engage R&D knowledge generation and build actionable mechanistic insight, thereby opening the door to enhanced productivity. What M&S requirements must be satisfied to open that door and begin reversing the productivity decline? Can current methods and tools fulfill the requirements, or are new methods necessary? We draw on the relevant recent literature to provide and explore answers. In so doing, we identify essential, key roles for agent-based and other methods. We assemble a list of requirements that M&S must satisfy to meet the diverse needs distilled from a collection of research, review, and opinion articles. We argue that, to realize its full potential, M&S should be actualized within a larger information technology framework, a dynamic knowledge repository, wherein models of various types execute, evolve, and increase in accuracy over time. We offer some details of the issues that must be addressed for such a repository to accrue the capabilities needed to reverse the productivity decline.
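The agent-based paradigm this abstract builds on can be illustrated with a minimal sketch: autonomous agents follow local interaction rules, and system-level behavior emerges from their interactions rather than from a top-level equation. The agents, rules, and parameters below are invented for illustration and are not drawn from the paper.

```python
import random

random.seed(1)

N = 100  # number of agents arranged on a ring

class Cell:
    """A toy agent with a binary state; 'stress' spreads between neighbors."""
    def __init__(self):
        self.stressed = False

    def next_state(self, left, right):
        # Local rule (illustrative): stress spreads from a stressed neighbor
        # with probability 0.5; a stressed cell recovers with probability 0.2.
        if not self.stressed and (left or right):
            return random.random() < 0.5
        if self.stressed:
            return random.random() >= 0.2
        return False

agents = [Cell() for _ in range(N)]
agents[0].stressed = True  # seed a single perturbation

for _ in range(30):
    # Synchronous update: read all states first, then apply new states.
    states = [a.stressed for a in agents]
    new = [a.next_state(states[i - 1], states[(i + 1) % N])
           for i, a in enumerate(agents)]
    for a, s in zip(agents, new):
        a.stressed = s

print(sum(a.stressed for a in agents), "of", N, "agents stressed")
```

The point of the sketch is the structure, not the specific rule: the same scheduler-plus-local-rules skeleton underlies the mechanistic, executable models the abstract argues for.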

    Evolutionary Computation and QSAR Research

    [Abstract] The successful high-throughput screening of molecule libraries for a specific biological property is one of the main improvements in drug discovery. Virtual molecular filtering and screening rely greatly on quantitative structure-activity relationship (QSAR) analysis, a mathematical model that correlates the activity of a molecule with molecular descriptors. QSAR models have the potential to reduce the costly failure of drug candidates in advanced (clinical) stages by filtering combinatorial libraries, eliminating candidates with predicted toxic effects or poor pharmacokinetic profiles, and reducing the number of experiments. To obtain a predictive and reliable QSAR model, scientists use methods from various fields such as molecular modeling, pattern recognition, machine learning, and artificial intelligence. QSAR modeling relies on three main steps: codification of the molecular structure into molecular descriptors, selection of the variables relevant to the analyzed activity, and search for the optimal mathematical model that correlates the molecular descriptors with a specific activity. Since a variety of techniques from statistics and artificial intelligence can aid the variable selection and model building steps, this review focuses on the evolutionary computation methods supporting these tasks. Thus, this review explains the basics of genetic algorithms and genetic programming as evolutionary computation approaches, the selection methods for high-dimensional data in QSAR, the methods used to build QSAR models, current evolutionary feature selection methods and their applications in QSAR, and future trends in joint or multi-task feature selection methods.
    Funding: Instituto de Salud Carlos III (PIO52048; RD07/0067/0005); Ministerio de Industria, Comercio y Turismo (TSI-020110-2009-53); Galicia, Consellería de Economía e Industria (10SIN105004P)
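The variable selection step described above can be sketched with a toy genetic algorithm: each individual is a bitmask over descriptors, and evolution favors small subsets that cover the informative ones. Everything here is a hypothetical stand-in; in a real QSAR pipeline the fitness function would be the cross-validated error of a regression model on measured activities, not the toy surrogate used below.

```python
import random

random.seed(0)

N_DESCRIPTORS = 20
RELEVANT = {2, 5, 11, 17}  # toy "ground truth": the informative descriptors

def fitness(mask):
    # Surrogate fitness: reward covering relevant descriptors, lightly
    # penalize subset size (parsimonious models tend to generalize better).
    hits = sum(1 for i in RELEVANT if mask[i])
    return hits - 0.1 * sum(mask)

def random_mask():
    return [random.random() < 0.3 for _ in range(N_DESCRIPTORS)]

def crossover(a, b):
    # One-point crossover of two descriptor masks.
    cut = random.randrange(1, N_DESCRIPTORS)
    return a[:cut] + b[cut:]

def mutate(mask, rate=0.05):
    # Flip each bit with a small probability.
    return [(not g) if random.random() < rate else g for g in mask]

def select(pop, k=3):
    # Tournament selection: best of k randomly chosen individuals.
    return max(random.sample(pop, k), key=fitness)

pop = [random_mask() for _ in range(40)]
for _ in range(60):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(len(pop))]

best = max(pop, key=fitness)
print(sorted(i for i, g in enumerate(best) if g))
```

Genetic programming, also covered by the review, differs in that individuals are whole model expressions rather than fixed-length masks, but the select/crossover/mutate loop is the same.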

    Overview on the Recent Drugs Delivery Approaches

    This review provides the reader with a concise overview of the biological barriers that hinder the delivery of therapeutic agents across membranes, such as the intestinal mucosa and the blood-brain barrier (BBB), and of mediators of transport such as efflux transporters, together with approaches for overcoming these barriers. The approaches discussed in this review include: utilizing naturally occurring transporters to deliver drugs specifically to their targets; delivery of nucleoside analogues; CYP-activated prodrugs that target drugs to the liver; modification of passive diffusion by efflux pumps; intestinal transporters such as PEPT1 and GLUT1; Carrier-Mediated Transport (CMT) systems for transporting nutrients, vitamins, or hormones into the central nervous system; tissue-selective drug delivery; administration of an exogenous enzyme that reaches the tumor site, followed by systemic administration of non-toxic prodrugs (ADEPT, GDEPT, and VDEPT); enzymes involved in the bioconversion (hydrolysis) of ester-based prodrugs to their active forms; brain-targeted Chemical Delivery Systems (CDS); amino acid prodrugs to improve oral bioavailability; sustained drug delivery; and intravenous drug delivery. In addition, Receptor-Mediated Transcytosis (RMT) for efficacious delivery of nanoparticles across the intestinal mucosa and the BBB, and a prodrug chemical approach based on intramolecularity for delivering anti-cancer drugs, are discussed.

    Strategies for creating new informational primitives in minds and machines

    Open-endedness is an important goal for designing systems that can autonomously find new and unexpected solutions to combinatorially complex and ill-defined problems. Classically, issues of the open-ended generation of novelty in the universe have come under the rubric of the problem of emergence. We distinguish two modes of creating novelty: combinatoric (new combinations of existing primitive

    Annotated Bibliography: Anticipation


    Omnipresent Maxwell’s demons orchestrate information management in living cells

    The development of synthetic biology calls for an accurate understanding of the critical functions that allow the construction and operation of a living cell. Besides coding for ubiquitous structures, minimal genomes encode a wealth of functions that dissipate energy in unanticipated ways. Analysis of these functions shows that they are meant to manage information under conditions where discrimination of substrates against a noisy background is preferred over a simple recognition process. We show here that many of these functions, including transporters and the ribosome construction machinery, behave as a material implementation of the information-managing agent theorized by Maxwell almost 150 years ago, commonly known as Maxwell’s demon (MxD), would behave. A core gene set encoding these functions belongs to the minimal genome required to allow the construction of an autonomous cell. These MxDs allow the cell to perform computations in an energy-efficient way that vastly outperforms our contemporary computers.