
    Ruthenium metallotherapeutics: a targeted approach to combatting multidrug resistant pathogens

    The discovery of antibiotics revolutionised healthcare practice. However, due to overuse, inappropriate use, widespread prophylactic therapy and the lack of new developments, antimicrobial resistance now poses a major global threat to health. By 2050, mortality due to antimicrobial-resistant infections is estimated to exceed 10 million people per annum, superseding cancer as the leading cause of global mortality. Drug repurposing to identify therapies that combat antimicrobial resistance is one potential solution. Metals have been used as antimicrobial agents throughout the history of medicine for a broad range of applications; the use of silver as an antimicrobial agent dates back to antiquity. More recently, ruthenium metallotherapeutic complexes have been shown to exhibit highly active antimicrobial properties against a range of bacterial species. In contrast to traditional antibiotics, these compounds are thought to elicit antibacterial activity at multiple sites within the bacterial cell, which may reduce the likelihood of resistance evolving. This study aimed to evaluate the antimicrobial activity of a series of ruthenium metallotherapeutic complexes against multidrug-resistant bacterial pathogens, with a focus on wound care applications. Antimicrobial susceptibility assays identified two lead candidates, hexaammineruthenium(III) chloride and [chlorido(η6-p-cymene)(N-(4-chlorophenyl)pyridine-2-carbothioamide)ruthenium(II)] chloride, which demonstrated activity against Pseudomonas aeruginosa and Staphylococcus aureus, respectively, with MIC values ranging between 4 μg mL⁻¹ and 16 μg mL⁻¹. Furthermore, hexaammineruthenium(III) chloride demonstrated antibiofilm activity in both a time- and concentration-dependent manner. Synergy studies combining the lead complexes with antibiotics demonstrated their potential for use as resistance breakers.
Subsequent in vitro infection modelling, using scratch assays with skin cell lines coupled with a 3D full-thickness skin wound infection model, was used to assess the potential of hexaammineruthenium(III) chloride as a topical antimicrobial agent against P. aeruginosa infections. Mechanistic studies demonstrated that hexaammineruthenium(III) chloride targeted the cell ultrastructure of P. aeruginosa strain PAO1, as cell perturbations were observed when treated cells were analysed by scanning electron microscopy. Exposure of P. aeruginosa PAO1 to hexaammineruthenium(III) chloride also resulted in concentration-dependent membrane depolarisation, further supporting this antimicrobial mechanism. Finally, global changes in gene expression following exposure of P. aeruginosa strain PAO1 to hexaammineruthenium(III) chloride were explored by RNA sequencing. Genes involved in ribosome function, cofactor biosynthesis and membrane fusion were downregulated, providing further insight into the wider mechanisms of antibacterial activity. The research conducted in the present study indicates the potential of hexaammineruthenium(III) chloride (and derivatives) as a treatment option for chronic wounds infected with P. aeruginosa, applied either directly or within antimicrobial wound care products.
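Synergy studies of the kind mentioned above are conventionally quantified with a fractional inhibitory concentration (FIC) index derived from checkerboard assays. The sketch below uses hypothetical MIC values for illustration, not data from this study:

```python
def fic_index(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    """Fractional inhibitory concentration (FIC) index for a two-drug combination.

    By convention, FIC <= 0.5 indicates synergy, 0.5-4 indifference/additivity,
    and > 4 antagonism.
    """
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

# Hypothetical MICs (μg/mL) for a metal complex (A) paired with an antibiotic (B)
fic = fic_index(mic_a_alone=16, mic_b_alone=8, mic_a_combo=2, mic_b_combo=1)
print(f"FIC index: {fic}")  # 2/16 + 1/8 = 0.25 → synergy
```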

    Serverless Strategies and Tools in the Cloud Computing Continuum

    In recent years, the popularity of Cloud computing has allowed users to access unprecedented compute, network, and storage resources under a pay-per-use model. This popularity has led to new services that solve specific large-scale computing challenges and simplify the development and deployment of applications. Among the most prominent services in recent years are FaaS (Function as a Service) platforms, whose primary appeal is the ease of deploying small pieces of code, in certain programming languages, to perform specific tasks on an event-driven basis. These functions are executed on the Cloud provider's servers without users having to worry about their maintenance or elasticity management, always keeping a fine-grained pay-per-use model. FaaS platforms belong to the computing paradigm known as Serverless, which aims to abstract server management away from users, allowing them to focus their efforts solely on application development. The problem with FaaS is that it focuses mainly on microservices and tends to have limitations regarding execution time and computing capabilities (e.g. lack of support for acceleration hardware such as GPUs). However, the self-provisioning capability and high degree of parallelism of these services have been shown to suit a broader range of applications. In addition, their inherent event-driven triggering makes functions well suited to being defined as steps in file-processing workflows (e.g. scientific computing workflows). Furthermore, the rise of smart and embedded devices (IoT), innovations in communication networks and the need to reduce latency in challenging use cases have led to the concept of Edge computing: conducting processing on devices close to the data sources to improve response times. The coupling of this paradigm with Cloud computing, involving architectures with devices at different levels depending on their proximity to the data source and their compute capability, has been coined the Cloud Computing Continuum (or Computing Continuum). This PhD thesis therefore aims to apply different Serverless strategies to enable the deployment of generalist applications, packaged in software containers, across the different tiers of the Cloud Computing Continuum. To this end, multiple tools have been developed in order to: i) adapt FaaS services from public Cloud providers; ii) integrate different software components to define a Serverless platform on on-premises and Edge infrastructures; iii) leverage acceleration devices on Serverless platforms; and iv) facilitate the deployment of applications and workflows through user interfaces. Additionally, several use cases have been created and adapted to assess the developments achieved.
Risco Gallardo, S. (2023). Serverless Strategies and Tools in the Cloud Computing Continuum [Doctoral thesis, by compendium]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/202013
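The event-driven FaaS functions described above can be pictured with a minimal handler sketch. The handler signature and the S3-style event schema are illustrative assumptions of a common public-Cloud convention, not artefacts of the thesis tooling:

```python
import json

def handler(event, context=None):
    """Minimal FaaS-style function: one file-processing workflow step per event.

    The event layout below mimics an AWS S3-style object notification; it is
    an illustrative assumption, and real providers and platforms differ.
    """
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    # ... the actual workflow step would process the object bucket/key here ...
    return {"statusCode": 200,
            "body": json.dumps({"processed": f"{bucket}/{key}"})}
```

Deployed to a FaaS platform, a function like this runs only when an upload event fires; elasticity and billing are handled per invocation by the provider, which is the fine-grained pay-per-use model the abstract refers to.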

    The ins and outs of open-angle glaucoma: drugs, diet, and defecation

    Glaucoma is the leading cause of irreversible blindness and the second leading cause of blindness overall. The primary aim of this thesis is to provide insight into the role of systemic effects in the pathophysiology of open-angle glaucoma (OAG).

    Precision Surface Processing and Software Modelling Using Shear-Thickening Polishing Slurries

    Mid-spatial frequency (MSF) surface error is a known manufacturing defect of aspherical and freeform precision surfaces. These surface ripples decrease imaging contrast and system signal-to-noise ratio, and existing sub-aperture polishing techniques are limited in their ability to smooth them. Shear-thickening slurries have been hypothesised to reduce MSF errors on precision optical surfaces by increasing the viscosity at the tool-part interface. Currently, controlling the generation of, and mitigating existing, MSF surface errors on aspherical and freeform surfaces requires extensive setup and the experience of seasoned workers. This thesis reports experimental trials of shear-thickening polishing slurries on glass surfaces. By incorporating shear-thickening slurries into the precessed bonnet technology, the aim is to enhance the ability of the precessions technology to mitigate MSF errors. The findings could enable a more streamlined manufacturing chain for precision optics using the versatile precessions technology, from form correction and texture improvement to MSF mitigation, without relying on other polishing technologies. Such an improvement to existing bonnet polishing would provide a vital stepping stone towards a fully autonomous manufacturing cell in a market of continual economic growth. The experiments in this thesis analysed the capabilities of two shear-thickening slurry systems: (1) polyethylene glycol with a silica nanoparticle suspension, and (2) a water and cornstarch suspension. Both slurry systems demonstrated the ability to mitigate existing surface ripples. In power spectral density graphs, polyethylene glycol slurries reduced the power of the mid-spatial frequencies by ~50% and cornstarch suspension slurries by 60-90%.
Experiments with a novel polishing approach are also reported, in which a precessed bonnet rotates at a predetermined working distance above the workpiece surface. The rapidly rotating tool draws the shear-thickening slurry through the gap, stiffening the fluid for polishing. This technique demonstrated material removal using cornstarch suspension slurries at a working distance of 1.0-1.5 mm. The volumetric removal rate of this process is ~5% of that of contact bonnet polishing, so it aligns more closely with a finishing process; the technique was therefore termed rheological bonnet finishing. The rheological properties of cornstarch suspension slurries were measured with a rheometer and modelled through CFD simulation. Using the empirical rheological data, polishing simulations of the rheological bonnet finishing process were modelled in Ansys to analyse the effects of input parameters such as working distance, tool head speed, precess angle, and slurry viscosity.
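The quoted mid-spatial-frequency power reductions come from comparing power spectral density (PSD) curves before and after polishing. A minimal sketch of band-limited PSD power for a 1-D surface profile, using a synthetic ripple rather than the thesis measurements:

```python
import numpy as np

def band_power(profile, dx, f_lo, f_hi):
    """Integrated PSD power of a 1-D surface profile in [f_lo, f_hi] (cycles/mm)."""
    n = len(profile)
    psd = np.abs(np.fft.rfft(profile - profile.mean()))**2 / n
    freqs = np.fft.rfftfreq(n, d=dx)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[mask].sum()

# Synthetic 50 mm profile sampled every 0.05 mm: low-frequency form error
# plus a mid-spatial-frequency ripple at 1.5 cycles/mm.
x = np.arange(0, 50, 0.05)
before = np.sin(2 * np.pi * 0.02 * x) + 0.2 * np.sin(2 * np.pi * 1.5 * x)
after  = np.sin(2 * np.pi * 0.02 * x) + 0.1 * np.sin(2 * np.pi * 1.5 * x)

# Power scales with amplitude squared, so halving the ripple amplitude
# cuts MSF band power by 75%.
reduction = 1 - band_power(after, 0.05, 1.0, 2.0) / band_power(before, 0.05, 1.0, 2.0)
print(f"MSF power reduced by {reduction:.0%}")
```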

    Natural and Technological Hazards in Urban Areas

    Natural hazard events and technological accidents are distinct causes of environmental impact. Natural hazards are physical phenomena that have been active throughout geological time, whereas technological hazards result from actions or facilities created by humans. In our time, combined natural and man-made hazards have also been induced. Overpopulation and urban development in areas prone to natural hazards increase the impact of natural disasters worldwide. Additionally, urban areas are frequently characterized by intense industrial activity and rapid, poorly planned growth that threatens the environment and degrades the quality of life. Proper urban planning is therefore crucial to minimize fatalities and reduce the environmental and economic impacts that accompany both natural and technological hazardous events.

    Spatial Distribution of Meso- and Microplastics in Floodplain Soilscapes: Novel Insights from Rural to Urban Floodplains in Central Germany

    Plastics, and especially microplastics, have become an emerging threat to global ecosystems. Despite the manifold benefits and applications of plastic, its uncontrolled release into the environment has led to a “global plastic crisis”. Over recent decades it has become apparent that this crisis has left plastics present in marine, aquatic and terrestrial systems worldwide. Environmental plastic research has also revealed that, although plastic often ends up in the oceans, the majority of environmental plastics are transported from land to sea via river systems as part of a “global plastic cycle”. These river systems are not isolated in the landscape but are part of an “aquatic-terrestrial interface” that also encompasses floodplains and their soilscapes. The present thesis focuses on the spatial distribution and spatio-temporal accumulation of meso- and microplastics in floodplain soilscapes, with the overall objective of unravelling the role of floodplain soilscapes as depositional areas for plastics within the global plastic cycle. In this context, a number of individual contributions have been published, ranging from conceptual spatial research approaches, through case studies conducted in two different floodplain soilscapes, to further opinions on the scientific benefit of plastic residues in floodplain soils. The individual contributions are linked by the central hypothesis that floodplain soilscapes have acted as temporal accumulation sites for plastics over the last 70 years, driven by flood-related processes and land use. To test this hypothesis, and to overcome the lack of spatial reference in microplastics research, a geospatial sampling approach was applied.
Initial spatial data on meso- and microplastics in floodplain soils were obtained through a holistic analysis approach, including the analysis of basic soil features and metals, the quantification of meso- and microplastics, and sediment dating. In both studied river floodplains, geospatial sampling enabled the detection of meso- and microplastics across the entire floodplain area and throughout the soil column to depths of two metres. A frequent accumulation of plastics was found within the upper 50 cm of floodplain soils. Combined with the dating of near-channel floodplain sites, this demonstrated that these plastic accumulations are related to recent sedimentary deposits laid down since the 1960s. However, evidence of plastic in deeper soil layers suggests that vertical displacement occurs in floodplain soils and that plastics become mobilised. Furthermore, the presence of plastics in upstream areas suggests that plastics are released into river systems and deposited via flood dynamics already in rural areas, and anthropogenic impacts such as tillage or floodplain restoration appear to influence plastic distributions. The findings of this thesis clarify that floodplain soilscapes are part of the global plastic cycle as temporary depositional areas for plastics, but they raise further questions about the mobility of plastics in soils and the exact contribution of different environmental drivers to plastic deposition. Finally, the present thesis indicates that the spatial reference of environmental plastic research should be rethought in order to understand the spatial dynamics of plastics within the aquatic-terrestrial interface.
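A depth-binned accumulation pattern like the one described above (plastics concentrated in the upper 50 cm) can be tabulated from core records in a few lines of pandas. The values below are synthetic illustrations, not the thesis data:

```python
import pandas as pd

# Illustrative soil-core records: sample depth (cm) and microplastic
# concentration (particles per kg dry soil). Synthetic values only.
core = pd.DataFrame({
    "depth_cm":         [10, 30, 45, 70, 110, 160, 190],
    "particles_per_kg": [820, 640, 510, 120, 60, 15, 5],
})

# Bin each sample into a soil layer and average concentrations per layer.
core["layer"] = pd.cut(core["depth_cm"], bins=[0, 50, 100, 200],
                       labels=["0-50 cm", "50-100 cm", "100-200 cm"])
profile = core.groupby("layer", observed=True)["particles_per_kg"].mean()
print(profile)  # mean concentration is highest in the upper 50 cm
```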

    Digital Innovations for a Circular Plastic Economy in Africa

    Plastic pollution is one of the biggest challenges of the twenty-first century and requires innovative and varied solutions. Focusing on sub-Saharan Africa, this book brings together interdisciplinary, multi-sectoral and multi-stakeholder perspectives exploring challenges and opportunities for utilising digital innovations to manage and accelerate the transition to a circular plastic economy (CPE). The book is organised into three sections bringing together discussion of environmental conditions, operational dimensions and country case studies of digital transformation towards the circular plastic economy. It explores the environment for digitisation in the circular economy, bringing together perspectives from practitioners in academia, innovation, policy, civil society and government agencies. The book also highlights specific country case studies on the development and implementation of different innovative ideas to drive the circular plastic economy across the three sub-Saharan African regions. Finally, the book interrogates the policy dimensions of, and practitioner perspectives on, a digitally enabled circular plastic economy. Written for a wide range of readers across academia, policy and practice, including researchers, students, small and medium enterprises (SMEs), digital entrepreneurs, non-governmental organisations (NGOs), multilateral agencies, policymakers and public officials, this book offers unique insights into the complex, multilayered issues of producing and managing plastic waste and highlights how digital innovations can drive the transition to the circular plastic economy in Africa. The Open Access version of this book, available at https://www.taylorfrancis.com, has been made available under a Creative Commons Attribution-Non Commercial-No Derivatives (CC-BY-NC-ND) 4.0 license.

    Engineering High-Resolution Experimental and Computational Pipelines to Characterize Human Gastrointestinal Tissues in Health and Disease

    In recent decades, new high-resolution technologies have transformed how scientists study complex cellular processes and the mechanisms responsible for maintaining homeostasis and for the emergence and progression of gastrointestinal (GI) disease. These advances have paved the way for the use of primary human cells in experimental models which together can mimic specific aspects of the GI tract, such as compartmentalized stem-cell zones, gradients of growth factors, and shear stress from fluid flow. The work presented in this dissertation has focused on integrating high-resolution bioinformatics with novel experimental models of the GI epithelium to describe the complexity of human pathophysiology of the small intestine, colon, and stomach in homeostasis and disease. Here, I used three novel microphysiological systems and developed four computational pipelines to describe comprehensive gene expression patterns of the GI epithelium in various states of health and disease. First, I used single-cell RNAseq (scRNAseq) to establish the transcriptomic landscape of the entire epithelium of the small intestine and colon from three human donors, describing cell-type-specific gene expression patterns at high resolution. Second, I used single-cell and bulk RNAseq to model intestinal absorption of fatty acids and show that fatty acid oxidation is a critical regulator of the flux of long- and medium-chain fatty acids across the epithelium. Third, I used bulk RNAseq and a machine learning model to describe how inflammatory cytokines can regulate proliferation of intestinal stem cells in an experimental model of inflammatory hypoxia. Finally, I developed a high-throughput platform that can associate phenotype with gene expression in clonal organoids, providing unprecedented resolution into the relationship between comprehensive gene expression patterns and their accompanying phenotypic effects.
Through these studies, I have demonstrated how the integration of computational and experimental approaches can measurably advance our understanding of human GI physiology.

    Efficient Deep Learning for Real-time Classification of Astronomical Transients

    A new golden age in astronomy is upon us, dominated by data. Large astronomical surveys are broadcasting unprecedented rates of information, making machine learning a critical component of modern scientific pipelines that must handle the deluge of data. The upcoming Legacy Survey of Space and Time (LSST) of the Vera C. Rubin Observatory will raise the big-data bar for time-domain astronomy, with an expected 10 million alerts per night, generating many petabytes of data over the lifetime of the survey. Fast and efficient classification algorithms that can operate in real time, yet robustly and accurately, are needed for time-critical events where additional resources can be sought for follow-up analyses. To handle such data, state-of-the-art deep learning architectures coupled with tools that leverage modern hardware accelerators are essential. The work contained in this thesis addresses the big-data challenges of LSST by proposing novel, efficient deep learning architectures for multivariate time-series classification that provide state-of-the-art classification of astronomical transients at a fraction of the computational cost of other deep learning approaches. This thesis introduces the depthwise-separable convolution and the notion of convolutional embeddings to the task of time-series classification, achieving gains in classification performance with far fewer model parameters than similar methods. It also introduces the attention mechanism to time-series classification, improving performance still further, with significant gains in computational efficiency and a further reduction in model size. Finally, this thesis pioneers the use of modern model compression techniques in photometric classification for efficient deep learning deployment.
These insights informed the final architecture, which was deployed in a live production machine learning system, demonstrating the capability to operate efficiently and robustly in real time, at LSST scale and beyond, ready for the new era of data-intensive astronomy.
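The parameter economy of the depthwise-separable convolution highlighted above can be made concrete with a quick count. The channel and kernel sizes below are generic illustrations, not the thesis architecture:

```python
def conv1d_params(c_in, c_out, k):
    """Parameter count of a standard 1-D convolution (bias omitted)."""
    return c_in * c_out * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise-separable 1-D convolution: one k-tap filter per input
    channel (depthwise), followed by a 1x1 pointwise convolution."""
    return c_in * k + c_in * c_out

# e.g. 64 input channels, 128 output channels, kernel size 9
std = conv1d_params(64, 128, 9)               # 64 * 128 * 9 = 73,728
sep = depthwise_separable_params(64, 128, 9)  # 64 * 9 + 64 * 128 = 8,768
print(f"standard: {std}, separable: {sep}, ratio: {sep / std:.3f}")
```

The roughly 8x reduction in this toy configuration is the mechanism behind "far fewer model parameters than similar methods": the saving grows with kernel size and output-channel count.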