
    When we can trust computers (and when we can't)

    With the relentless rise of computer power, there is a widespread expectation that computers can solve the most pressing problems of science, and even more besides. We explore the limits of computational modelling and conclude that, in the domains of science and engineering which are relatively simple and firmly grounded in theory, these methods are indeed powerful. Even so, the availability of code, data and documentation, along with a range of techniques for validation, verification and uncertainty quantification, is essential for building trust in computer-generated findings. When it comes to complex systems in domains of science that are less firmly grounded in theory, notably biology and medicine, to say nothing of the social sciences and humanities, computers can create the illusion of objectivity, not least because the rise of big data and machine learning poses new challenges to reproducibility while lacking true explanatory power. We also discuss important aspects of the natural world which cannot be addressed by digital means. In the long term, renewed emphasis on analogue methods will be necessary to temper the excessive faith currently placed in digital computation. This article is part of the theme issue 'Reliability and reproducibility in computational science: implementing verification, validation and uncertainty quantification in silico'.
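
    The abstract names uncertainty quantification as one of the techniques needed to build trust in computed results. As a minimal illustration of forward uncertainty quantification, the Python sketch below propagates an assumed uncertainty in a model parameter through a toy model by Monte Carlo sampling; the model and the parameter values are hypothetical and are not taken from the article.

    ```python
    import numpy as np

    # Hypothetical model: exponential decay y(t) = y0 * exp(-k * t).
    def model(k, t=1.0, y0=1.0):
        return y0 * np.exp(-k * t)

    # Assumed uncertainty in the rate constant k (mean 0.5, standard deviation 0.05).
    rng = np.random.default_rng(0)
    k_samples = rng.normal(loc=0.5, scale=0.05, size=10_000)

    # Push the samples through the model and summarise the spread of the output.
    y_samples = model(k_samples)
    print(f"y(1) = {y_samples.mean():.3f} +/- {y_samples.std():.3f}")
    ```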

    Big data need big theory too.

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches, which fail to provide conceptual accounts for the processes to which they are applied, with particular focus on biology and medicine. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.
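
    The claim that data-driven methods 'merely fit curves' and can fail beyond the range of their training data can be made concrete with a toy example (entirely illustrative, not taken from the paper): a flexible polynomial fitted to noisy samples of a smooth function reproduces it well inside the sampled interval but diverges from it as soon as it is asked to extrapolate.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Training data: noisy samples of sin(x) on a limited interval.
    x_train = np.linspace(0, 2 * np.pi, 30)
    y_train = np.sin(x_train) + rng.normal(scale=0.05, size=x_train.size)

    # A purely data-driven fit with no knowledge of the underlying structure.
    coeffs = np.polyfit(x_train, y_train, deg=9)

    # Inside the training range the fit tracks the true function closely ...
    x_in = np.pi / 2
    print("inside :", np.polyval(coeffs, x_in), "vs true", np.sin(x_in))

    # ... but beyond the training range the prediction diverges badly.
    x_out = 3 * np.pi
    print("outside:", np.polyval(coeffs, x_out), "vs true", np.sin(x_out))
    ```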

    Pandemic Drugs at Pandemic Speed: Infrastructure for Accelerating COVID-19 Drug Discovery with Hybrid Machine Learning- and Physics-based Simulations on High Performance Computers

    The race to meet the challenges of the global pandemic has served as a reminder that the existing drug discovery process is expensive, inefficient and slow. A major bottleneck is screening the vast number of potential small molecules to shortlist lead compounds for antiviral drug development. New opportunities to accelerate drug discovery lie at the interface between machine learning methods, in this case developed for linear accelerators, and physics-based methods. The two in silico methods each have their own advantages and limitations which, interestingly, complement each other. Here, we present an innovative infrastructural development that combines both approaches to accelerate drug discovery. The scale of the resulting workflow is such that it depends on supercomputing to achieve extremely high throughput. We have demonstrated the viability of this workflow for the study of inhibitors of four COVID-19 target proteins, and our ability to perform the required large-scale calculations to identify lead antiviral compounds through repurposing on a variety of supercomputers.
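
    As a schematic of how such a hybrid pipeline might couple the two approaches (a minimal sketch with hypothetical function names, not the paper's actual infrastructure): a cheap machine learning surrogate ranks a large compound library, and only the top-ranked candidates are passed to the far more expensive physics-based calculations.

    ```python
    from typing import Callable, List, Tuple

    def screen(compounds: List[str],
               ml_score: Callable[[str], float],
               physics_score: Callable[[str], float],
               shortlist_size: int = 10) -> List[Tuple[str, float]]:
        """Hypothetical hybrid screen: a fast ML surrogate filters the library,
        then expensive physics-based scoring is applied only to the shortlist."""
        # Stage 1: rank the whole library with the cheap ML surrogate.
        ranked = sorted(compounds, key=ml_score, reverse=True)
        shortlist = ranked[:shortlist_size]
        # Stage 2: rescore the shortlist with the physics-based method
        # (in a real workflow this would be docking or free-energy runs on HPC).
        rescored = [(c, physics_score(c)) for c in shortlist]
        return sorted(rescored, key=lambda pair: pair[1], reverse=True)

    # Toy usage with stand-in scoring functions.
    if __name__ == "__main__":
        library = [f"compound_{i}" for i in range(1000)]
        hits = screen(library,
                      ml_score=lambda c: hash(c) % 100 / 100.0,        # stand-in surrogate
                      physics_score=lambda c: hash(c[::-1]) % 100 / 100.0)
        print(hits[:3])
    ```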

    From digital hype to analogue reality: Universal simulation beyond the quantum and exascale eras

    Many believe that the future of innovation lies in simulation. However, as computers become ever more powerful, so too does the hyperbole used to discuss their potential in modelling across a vast range of domains, from subatomic physics to chemistry, climate science, epidemiology, economics and cosmology. As we are about to enter the era of quantum and exascale computing, machine learning and artificial intelligence have entered the field in a significant way. In this article we give a brief history of simulation, discuss how machine learning can be more powerful if underpinned by deeper mechanistic understanding, outline the potential of exascale and quantum computing, highlight the limits of digital computing, both classical and quantum, and distinguish rhetoric from reality in assessing the future of modelling and simulation, in which we believe analogue computing will play an increasingly important role.

    Determination of the magnetic penetration depth of the high-Tc superconductor YBa2Cu3O7-x by polarised neutron reflection

    Felcher and co-workers, in their work on superconducting films of niobium, lead and lead-bismuth, have used the reflection of spin-polarized slow neutrons to obtain a direct and absolute measurement of the magnetic penetration depth. In this paper, the authors report the first such measurement of the penetration depth in a sample of YBa2Cu3O7-x. At a temperature of 4.8 K and in an applied magnetic field of 350 Oe they obtain a value, which represents an upper limit, of 225 ± 75 Å. This is small compared with penetration depths in conventional superconductors, and with the recently quoted values for YBa2Cu3O7-x and La1.85Ba0.15CuO4.
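
    For context (a standard textbook result, not taken from the paper): the penetration depth is the length scale over which an applied magnetic field decays inside a superconductor, and it is this exponentially decaying field that the reflected spin-polarized neutrons probe via the difference in reflectivity between the two neutron spin states.

    ```latex
    % Standard London electrodynamics: a field B_a applied parallel to the
    % surface of a superconductor occupying x > 0 decays as
    \[
      B(x) = B_a \, e^{-x/\lambda_L},
      \qquad
      \lambda_L = \sqrt{\frac{m}{\mu_0 n_s e^2}},
    \]
    % where n_s is the superfluid carrier density and m, e are the carrier mass and charge.
    ```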