13 research outputs found

    Multi-State RNA Design with Geometric Multi-Graph Neural Networks

    Computational RNA design has broad applications across synthetic biology and therapeutic development. Fundamental to the diverse biological functions of RNA is its conformational flexibility, enabling single sequences to adopt a variety of distinct 3D states. Currently, computational biomolecule design tasks are often posed as inverse problems, where sequences are designed based on adopting a single desired structural conformation. In this work, we propose gRNAde, a geometric RNA design pipeline that operates on sets of 3D RNA backbone structures to explicitly account for and reflect RNA conformational diversity in its designs. We demonstrate the utility of gRNAde for improving native sequence recovery over single-state approaches on a new large-scale 3D RNA design dataset, especially for multi-state and structurally diverse RNAs. Our code is available at https://github.com/chaitjo/geometric-rna-desig
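The multi-state idea, encoding each conformer separately and pooling over states in an order-invariant way before decoding a single sequence, can be sketched in a few lines. gRNAde's real pipeline uses geometric GNN encoders over 3D backbones; the feature shapes and linear decoder below are purely illustrative:

```python
import numpy as np

RNA_ALPHABET = "ACGU"

def design_logits(state_feats, W):
    """Toy multi-state decoder. state_feats has shape
    (n_states, n_nucleotides, d): one feature set per conformer.
    Mean pooling over the first axis makes the design invariant to
    the order in which conformations are supplied."""
    pooled = state_feats.mean(axis=0)   # (n_nucleotides, d)
    return pooled @ W                   # (n_nucleotides, 4) per-letter logits

rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 5, 8))      # 3 states, 5 nucleotides, 8-dim features
W = rng.normal(size=(8, len(RNA_ALPHABET)))
logits = design_logits(feats, W)
seq = "".join(RNA_ALPHABET[i] for i in logits.argmax(axis=1))
```

A single-state method corresponds to dropping the pooling and conditioning on one conformer only, which is what the multi-state pipeline is designed to improve upon.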

    Benchmarking Generated Poses: How Rational is Structure-based Drug Design with Generative Models?

    Deep generative models for structure-based drug design (SBDD), where molecule generation is conditioned on a 3D protein pocket, have received considerable interest in recent years. These methods offer the promise of higher-quality molecule generation by explicitly modelling the 3D interaction between a potential drug and a protein receptor. However, previous work has primarily focused on the quality of the generated molecules themselves, with limited evaluation of the 3D poses these methods produce: most work simply discards the generated pose and reports only a "corrected" pose after redocking with traditional methods. Little is known about whether generated molecules satisfy known physical constraints for binding, or about the extent to which redocking alters the generated interactions. We introduce PoseCheck, an extensive analysis of multiple state-of-the-art methods, and find that generated molecules have significantly more physical violations and fewer key interactions compared to baselines, calling into question the implicit assumption that providing rich 3D structure information improves molecule complementarity. We make recommendations for future research tackling the identified failure modes and hope our benchmark can serve as a springboard for future SBDD generative modelling work to have real-world impact.
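One of the physical constraints in question, steric clashes, can be checked directly from atom coordinates. The sketch below counts ligand-protein atom pairs closer than the sum of their van der Waals radii minus a tolerance; the radii, tolerance value, and clash definition are illustrative assumptions, not PoseCheck's exact criteria:

```python
import numpy as np

VDW = {"C": 1.70, "N": 1.55, "O": 1.52}   # approximate van der Waals radii, in Angstroms

def count_clashes(lig_xyz, lig_elems, prot_xyz, prot_elems, tol=0.5):
    """Count ligand-protein atom pairs whose distance falls below the
    sum of their vdW radii minus a tolerance -- one simple notion of a
    steric clash for a generated binding pose."""
    lig_r = np.array([VDW[e] for e in lig_elems])
    prot_r = np.array([VDW[e] for e in prot_elems])
    d = np.linalg.norm(lig_xyz[:, None, :] - prot_xyz[None, :, :], axis=-1)
    cutoff = lig_r[:, None] + prot_r[None, :] - tol
    return int((d < cutoff).sum())
```

A redocked pose would typically score near zero under such a check, which is why evaluating the raw generated pose, rather than the redocked one, is the more informative test.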

    Data-driven discovery of molecular photoswitches with multioutput Gaussian processes

    Photoswitchable molecules display two or more isomeric forms that may be accessed using light. Separating the electronic absorption bands of these isomers is key to selectively addressing a specific isomer and achieving high photostationary states, whilst overall red-shifting the absorption bands serves to limit material damage due to UV exposure and increases penetration depth in photopharmacological applications. Engineering these properties into a system through synthetic design, however, remains a challenge. Here, we present a data-driven discovery pipeline for molecular photoswitches underpinned by dataset curation and multitask learning with Gaussian processes. In the prediction of electronic transition wavelengths, we demonstrate that a multioutput Gaussian process (MOGP) trained using labels from four photoswitch transition wavelengths yields the strongest predictive performance relative to single-task models, as well as operationally outperforming time-dependent density functional theory (TD-DFT) in terms of the wall-clock time for prediction. We validate our proposed approach experimentally by screening a library of commercially available photoswitchable molecules. Through this screen, we identified several motifs that displayed separated electronic absorption bands of their isomers, exhibited red-shifted absorptions, and are suited for information transfer and photopharmacological applications. Our curated dataset, code, as well as all models are made available at https://github.com/Ryan-Rhys/The-Photoswitch-Dataset
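A common way to build an MOGP of this kind is the intrinsic coregionalization model (ICM), where the covariance between observation (x_i, task s) and (x_j, task t) factorizes as k(x_i, x_j) * B[s, t], so a sparsely labelled wavelength can borrow strength from well-sampled, correlated ones. The numpy sketch below illustrates that construction under assumed kernel and noise settings; it is not the authors' implementation:

```python
import numpy as np

def rbf(X1, X2, ls=1.0):
    # Squared-exponential kernel on the inputs (e.g. molecular descriptors)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls ** 2)

def icm_predict(X, y, tasks, Xs, tasks_s, B, noise=1e-2, ls=1.0):
    """ICM posterior mean: joint covariance of (x_i, task s) and
    (x_j, task t) is k(x_i, x_j) * B[s, t], where B is the
    coregionalization matrix coupling the output tasks."""
    Kxx = rbf(X, X, ls) * B[np.ix_(tasks, tasks)]
    Ksx = rbf(Xs, X, ls) * B[np.ix_(tasks_s, tasks)]
    alpha = np.linalg.solve(Kxx + noise * np.eye(len(X)), y)
    return Ksx @ alpha
```

With B equal to the identity this collapses to independent single-task GPs, which is the baseline the multitask model is compared against.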

    Ethoscopes: An open platform for high-throughput ethomics

    Here, we present the use of ethoscopes, which are machines for high-throughput analysis of behavior in Drosophila and other animals. Ethoscopes provide a software and hardware solution that is reproducible and easily scalable. They perform real-time tracking and profiling of behavior using a supervised machine learning algorithm, can deliver behaviorally triggered stimuli to flies in a feedback-loop mode, and are highly customizable and open source. Ethoscopes can be built easily using 3D printing technology and rely on Raspberry Pi microcomputers and Arduino boards to provide affordable and flexible hardware. All software and construction specifications are available at http://lab.gilest.ro/ethoscope.

    The ethoscope platform.

    (A) A diagram of the typical setup. Ethoscopes, powered through a USB adapter, are connected in an intranet mesh through an AP or a Wi-Fi router. A computer in the network acts as the node, receiving data from ethoscopes and serving a web-UI through which ethoscopes can be controlled, either locally or remotely. (B) Screenshot of the homepage of the web-UI, showing a list of running machines and some associated experimental metadata (e.g., username and location). (C) Screenshot of an ethoscope control page on the web-UI, providing metadata about the experiment and a real-time updated snapshot from the ethoscope point of view. AP, access point; GMT, Greenwich mean time; FPS, frames per second; USB, universal serial bus; web UI, web-based user interface.

    Versatility of use with custom behavioral arenas.

    (A-H) Examples of 8 different behavioral arenas whose files for 3D printing are available on the ethoscope website. (A) Sleep arena. The most commonly used arena in our laboratory for sleep studies, lodging 20 individual tubes. (B) Long tubes arena. It houses 13-cm tubes and can be used for odor delivery studies or, more generally, for behaviors requiring more space. (C) Food bullet arena. Animals are placed directly on the arena and food can be replaced by pushing in a new bullet [11]. It does not require glass tubes and can be used for quick administration of chemicals in the food. (D) Decision making arena. It can be used to study simple decision making behaviors, adapted from Hirsch [3]. (E) Square wells arena. It can be used for courtship assays or to record activity in a bidimensional environment. (F, G) Conceptually analogous to A and I, but designed to work in high-resolution (full-HD) settings. (H) Round wells arena, modelled following specifications from Simon and Dickinson [22]. Note that all arenas are marked with 3 visible reference points (indicated by a red circle in A) that are used by the ethoscope to automatically define regions of interest for tracking. HD, high-definition.
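The three reference points mentioned above are exactly enough to determine a 2D affine transform, which is presumably how detected marks can be mapped onto canonical arena coordinates to place the regions of interest. A minimal numpy sketch of that geometric step (not the ethoscope's actual tracking code):

```python
import numpy as np

def affine_from_points(src, dst):
    """Solve for the 2x3 affine matrix A mapping three source points
    to three destination points: dst_i = A[:, :2] @ src_i + A[:, 2]."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    M = np.hstack([src, np.ones((3, 1))])   # 3x3 homogeneous source coords
    return np.linalg.solve(M, dst).T        # 2x3 affine matrix

def apply_affine(A, pts):
    """Map an (n, 2) array of points through the affine transform."""
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]
```

Once the affine transform is known, the fixed ROI layout of each arena design (e.g. 20 tube positions) can be mapped into camera coordinates regardless of how the arena sits under the camera.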

    The ethoscope.

    (A) Exploded drawing of an archetypal ethoscope. The machine is composed of 2 main parts: an upper case housing the rPi and its camera, and a lower case providing diffused infrared illumination and support for the experimental arena. The 2 cases are separated by spacers maintaining a fixed focal distance (140 mm for rPi camera 1.0). (B) A rendered drawing of the assembled model, showing the actual size without cables; USB and connection cables will slightly increase the total size. The arena slides into place through guides and locks into position. A webGL interactive 3D model is available as S1 Fig. (C) The LEGOscope, a version of the ethoscope built using LEGO bricks. A detailed instruction manual is provided in S1 Text. (D) The PAPERscope, a paper and cardboard version of the ethoscope, best assembled using 220 gsm paper and 1 mm gray board. Blueprints are provided in S2 Text. In all cases, ethoscopes must be powered with a 5 V DC input using a common USB micro cable connected either to the mains or to a portable power pack. DC, direct current; HD, high-definition; LED, light-emitting diode; rPi, Raspberry Pi; USB, universal serial bus.

    Versatility of use with behavioral feedback-loop modules.

    (A) Diagram and (B) detail of the AGO-delivery module. Two independent flows (blue and purple in the drawing) are fed into the module using external sources. The module features 10 LEGO pneumatic valves, each independently controlled through a servo motor. The motor switches the air source on the valve, selecting which source will be relayed to the tube containing the fly. Available positions are blue source, purple source, and closed. (C) Representative response of 3 flies subjected to CO₂ administration using the AGO module. CO₂ release lasts 5 seconds (grey bar) and is triggered by midline crossing (red dot). The blue line indicates the fly position in the tube over the 150-second period. (D) Model and (E) detail of the rotational module. The module employs a servo motor to turn the tube hosting the fly. The direction, speed, duration, and angle of the rotation can be modulated to change the quality of the stimulus. (F) Representative response of 3 flies upon stimulation using the rotational module shown in (D, E). Rotation of the tube is triggered by 20 consecutive seconds of immobility (dashed line) and is followed by 5 seconds of masking, during which tracking is suspended to avoid motion artefacts (cyan area). The bottom panel shows traces of a dead fly. (G) Model of the optomotor module, able to simultaneously stimulate single flies with rotational motion and light. (H) Detailed view of the optomotor minimal unit. Light is directed into the tube using optical fiber. S2 Movie shows the optomotor module in action. (I-P) The servo module employed for a sleep deprivation experiment. Flies shown in grey are unstimulated mock controls, never experiencing tube rotations. Flies shown in light blue experience rotation either after 20 seconds of inactivity (I-L) or after midline crossing (M-P). (J, N) Sleep profile of flies over 3 days in conditions of 12-hour:12-hour light and dark cycles. Gray shadings indicate the stimulation period and the following sleep rebound period. (K, O) Number of tube rotations delivered during the 12-hour stimulation period. (L, P) Quantification of sleep rebound during the first 3 hours of the day following the stimulation. AGO, air/gas/odor; DC, direct current; LED, light-emitting diode; rpm, revolutions per minute; SD, secure digital; ZT, Zeitgeber time.
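The immobility-triggered rotation with post-stimulus masking described in (F) can be expressed as a small state machine over the tracked position. The class below is a hypothetical sketch of that logic, not the ethoscope API; parameter names and the movement threshold are illustrative:

```python
class ImmobilityTrigger:
    """Feedback-loop trigger in the spirit of the rotational module:
    fire a stimulus after `immobile_s` seconds without movement, then
    suspend tracking for `mask_s` seconds to avoid motion artefacts."""

    def __init__(self, immobile_s=20.0, mask_s=5.0, move_eps=1.0):
        self.immobile_s = immobile_s
        self.mask_s = mask_s
        self.move_eps = move_eps        # minimum position change counted as movement
        self.last_pos = None
        self.last_move_t = None
        self.mask_until = float("-inf")

    def update(self, t, pos):
        """Feed one tracking sample (time in s, 1D position);
        returns True on the samples where the stimulus fires."""
        if t < self.mask_until:         # masking window after a stimulus
            return False
        if self.last_pos is None or abs(pos - self.last_pos) > self.move_eps:
            self.last_pos, self.last_move_t = pos, t
            return False
        if t - self.last_move_t >= self.immobile_s:
            self.mask_until = t + self.mask_s
            self.last_move_t = t        # restart the immobility clock
            return True
        return False
```

The midline-crossing trigger used in (M-P) would follow the same pattern, firing when the position crosses a fixed coordinate instead of after a period of stillness.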