Converging organoids and extracellular matrix: New insights into liver cancer biology
Primary liver cancer, consisting primarily of hepatocellular carcinoma (HCC) and cholangiocarcinoma (CCA), is a heterogeneous malignancy with a dismal prognosis and the third leading cause of cancer mortality worldwide [1, 2]. It is characterized by unique histological features, late-stage diagnosis, a highly variable mutational landscape, and high levels of heterogeneity in biology and etiology [3-5]. Treatment options are limited, with surgical intervention as the main curative option, although it is not available for the majority of patients, who are diagnosed at an advanced stage. Major contributing factors to this complexity and to the limited treatment options are the interactions between primary tumor cells, non-neoplastic stromal and immune cells, and the extracellular matrix (ECM). ECM dysregulation plays a prominent role in multiple facets of liver cancer, including initiation and progression [6, 7]. HCC often develops in already damaged environments containing large areas of inflammation and fibrosis, while CCA is commonly characterized by significant desmoplasia, the extensive formation of connective tissue surrounding the tumor [8, 9]. Thus, to gain a better understanding of liver cancer biology, sophisticated in vitro tumor models need to comprehensively incorporate the various aspects that together dictate liver cancer progression. Therefore, the aim of this thesis is to create in vitro liver cancer models through organoid technology approaches, allowing for novel insights into liver cancer biology and, in turn, providing potential avenues for therapeutic testing. To model primary epithelial liver cancer cells, organoid technology is employed in part I. To study and characterize the role of the ECM in liver cancer, decellularization of tumor tissue, adjacent liver tissue, and distant metastatic organs (i.e.
lung and lymph node) is described, characterized, and combined with organoid technology to create improved tissue-engineered models for liver cancer in part II of this thesis. Chapter 1 provides a brief introduction to the concepts of liver cancer, cellular heterogeneity, decellularization, and organoid technology. It also explains the rationale behind the work presented in this thesis. Chapter 2 provides an in-depth analysis of organoid technology and contrasts it with other in vitro cell culture systems employed for liver cancer modeling. Reliable establishment of liver cancer organoids is crucial for advancing translational applications of organoids, such as personalized medicine. Therefore, as described in chapter 3, a multi-center analysis was performed on the establishment of liver cancer organoids. This revealed a global establishment efficiency rate of 28.2% (19.3% for hepatocellular carcinoma organoids (HCCO) and 36% for cholangiocarcinoma organoids (CCAO)). Additionally, potential solutions and future perspectives for increasing establishment are provided. Liver cancer organoids consist solely of primary epithelial tumor cells. To engineer an in vitro tumor model with the possibility of immunotherapy testing, CCAO were combined with immune cells in chapter 4. Co-culture of CCAO with peripheral blood mononuclear cells and/or allogeneic T cells revealed an effective anti-tumor immune response, with distinct interpatient heterogeneity. These cytotoxic effects were mediated by cell-cell contact and release of soluble factors, although indirect killing through soluble factors was only observed in one organoid line. Thus, this model provides a first step towards developing immunotherapy for CCA on an individual patient level. The success of personalized medicine depends on an organoid's ability to recapitulate patient tissue faithfully.
Therefore, in chapter 5 a novel organoid system was created in which branching morphogenesis was induced in cholangiocyte and CCA organoids. Branching cholangiocyte organoids self-organized into tubular structures with high similarity to primary cholangiocytes, based on single-cell sequencing and functionality. Similarly, branching CCAO acquired an in vitro morphology more similar to that of primary tumors. Moreover, these branching CCAO showed a higher correlation to the transcriptomic profile of patient-paired tumor tissue and increased resistance to gemcitabine and cisplatin, the standard chemotherapy regimen for CCA patients in the clinic. As discussed, CCAO represent the epithelial compartment of CCA. Proliferation, invasion, and metastasis of epithelial tumor cells are highly influenced by interactions with their cellular and extracellular environment. Remodeling of various properties of the extracellular matrix (ECM), including stiffness, composition, alignment, and integrity, influences tumor progression. Chapter 6 discusses the alterations of the ECM in solid tumors and the translational impact of our increased understanding of these alterations. The success of ECM-related cancer therapy development requires an intimate understanding of the malignancy-induced changes to the ECM. This principle was applied to liver cancer in chapter 7, where the dysregulation of liver cancer ECM was characterized through an integrative molecular and mechanical approach. An optimized agitation-based decellularization protocol was established for primary liver cancer (HCC and CCA) and paired adjacent tissue (HCC-ADJ and CCA-ADJ). Novel malignancy-related ECM protein signatures were found, which were previously overlooked in liver cancer transcriptomic data. Additionally, probing of the mechanical characteristics revealed divergent macro- and micro-scale mechanical properties and a higher alignment of collagen in CCA.
This study provided a better understanding of ECM alterations during liver cancer, as well as a potential scaffold for the culture of organoids. This was applied to CCA in chapter 8 by combining decellularized CCA tumor ECM and tumor-free liver ECM with CCAO to study cell-matrix interactions. Culture of CCAO in tumor ECM resulted in a transcriptome closely resembling that of in vivo patient tumor tissue, accompanied by an increase in chemoresistance. In tumor-free liver ECM, devoid of desmoplasia, CCAO initiated a desmoplastic reaction through increased collagen production. If desmoplasia was already present, the organoids produced distinct ECM proteins: tumor-related proteins associated with poor patient survival. To extend this method of studying cell-matrix interactions to a metastatic setting, lung and lymph node tissue, common locations of metastasis in CCA, was decellularized and recellularized with CCAO in chapter 9. Decellularization removed cells while preserving ECM structure and protein composition, linked to tissue-specific functional hallmarks. Recellularization revealed that lung and lymph node ECM induced different gene expression profiles in the organoids, related to cancer stem cell phenotype, cell-ECM integrin binding, and epithelial-to-mesenchymal transition. Furthermore, the metabolic activity of CCAO in lung and lymph node ECM was significantly influenced by the metastatic location, the original characteristics of the patient tumor, and the donor of the target organ. The in vitro tumor models described above utilized decellularized scaffolds with native structure. Decellularized ECM can also be used for the creation of tissue-specific hydrogels through digestion and gelation procedures. Such hydrogels were created from both porcine and human livers in chapter 10.
The liver ECM-based hydrogels were used to initiate and culture healthy cholangiocyte organoids, which maintained cholangiocyte marker expression, thus providing an alternative to BME for organoid initiation. Building upon this, in chapter 11 human liver ECM-based extracts were used in combination with a one-step microfluidic encapsulation method to produce size-standardized CCAO. The established system can reduce the size variability conventionally seen in organoid culture by providing uniform scaffolding. Encapsulated CCAO retained their stem cell phenotype and were amenable to drug screening, showing the feasibility of scalable CCAO production for high-throughput drug screening approaches. Lastly, chapter 12 provides a global discussion and future outlook on tumor tissue engineering strategies for liver cancer using organoid technology and decellularization. Combining multiple aspects of liver cancer, both cellular and extracellular, with tissue engineering strategies provides advanced tumor models that can delineate fundamental mechanistic insights as well as provide a platform for drug screening approaches.
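Several chapters above judge organoid fidelity by how strongly the organoid transcriptome correlates with patient-paired tumor tissue. A minimal stdlib sketch of that comparison, using a Pearson correlation over invented log-expression values (the gene values and variable names are illustrative assumptions, not data from the thesis):

```python
import math

def pearson(x, y):
    """Pearson correlation between two expression profiles (same gene order)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical log-expression values for the same five genes measured in a
# branching organoid line and in patient-paired tumor tissue (values invented).
organoid = [2.1, 5.3, 0.4, 7.8, 3.2]
tissue = [2.0, 5.0, 0.9, 7.5, 3.6]
r = pearson(organoid, tissue)
```

A value of `r` near 1 indicates that the organoid profile tracks the tissue profile closely; real analyses compute this over thousands of genes, often on rank-transformed (Spearman) values.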
Artificial intelligence for predictive biomarker discovery in immuno-oncology: a systematic review
Background: The widespread use of immune checkpoint inhibitors (ICIs) has revolutionised treatment of multiple cancer types. However, selecting patients who may benefit from ICI remains challenging. Artificial intelligence (AI) approaches allow exploitation of high-dimensional oncological data in research and development of precision immuno-oncology. Materials and methods: We conducted a systematic literature review of peer-reviewed original articles studying ICI efficacy prediction in cancer patients across five data modalities: genomics (including transcriptomics and epigenomics), radiomics, digital pathology (pathomics), real-world data, and multimodal data. Results: A total of 90 studies were included in this systematic review, with 80% published in 2021-2022. Among them, 37 studies included genomic, 20 radiomic, 8 pathomic, 20 real-world, and 5 multimodal data. Standard machine learning (ML) methods were used in 72% of studies, deep learning (DL) methods in 22%, and both in 6%. The most frequently studied cancer type was non-small-cell lung cancer (36%), followed by melanoma (16%), while 25% were pan-cancer studies. No prospective study design incorporated AI-based methodologies from the outset; rather, all implemented AI as a post hoc analysis. Novel biomarkers for ICI in radiomics and pathomics were identified using AI approaches, and molecular biomarkers have expanded past genomics into transcriptomics and epigenomics. Finally, complex algorithms and new types of AI-based markers, such as meta-biomarkers, are emerging from the integration of multimodal/multi-omics data. Conclusion: AI-based methods have expanded the horizon for biomarker discovery, demonstrating the power of integrating multimodal data from existing datasets to discover new meta-biomarkers. While most of the included studies showed promise for AI-based prediction of benefit from immunotherapy, none provided high-level evidence for immediate practice change.
A priori planned prospective trial designs are needed to cover all lifecycle steps of these software biomarkers, from development and validation to integration into clinical practice
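The predictive performance of such ICI-response models is commonly reported as ROC AUC, which has a simple rank-based (Mann-Whitney) reading: the probability that a randomly chosen responder is scored above a randomly chosen non-responder. A minimal stdlib sketch with invented scores and labels (purely illustrative, not data from the review):

```python
def roc_auc(scores, labels):
    """AUC via the rank-sum view: probability that a randomly chosen
    responder (label 1) outranks a randomly chosen non-responder,
    counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy predicted probabilities of ICI benefit with true response labels
# (values invented for illustration).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.2]
labels = [1, 1, 0, 1, 0, 0]
auc = roc_auc(scores, labels)
```

Production pipelines would use a library implementation (e.g. scikit-learn's `roc_auc_score`), which computes the same quantity via ranks.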
Dataflow Programming and Acceleration of Computationally-Intensive Algorithms
The volume of unstructured textual information continues to grow due to recent technological advancements. This has resulted in an exponential growth of information generated in various formats, including blogs, posts, social networking, and enterprise documents. Numerous Enterprise Architecture (EA) documents are also created daily, such as reports, contracts, agreements, frameworks, architecture requirements, designs, and operational guides. The processing and computation of this massive amount of unstructured information necessitate substantial computing capabilities and the implementation of new techniques. It is critical to manage this unstructured information through a centralized knowledge management platform. Knowledge management is the process of managing information within an organization. It involves creating, collecting, organizing, and storing information in a way that makes it easily accessible and usable. The research involved the development of a textual knowledge management system, and two use cases were considered for extracting textual knowledge from documents. The first case study focused on the safety-critical documents of a railway enterprise. Safety is of paramount importance in the railway industry. Several EA documents, including manuals, operational procedures, and technical guidelines, contain critical information. Digitalization of these documents is essential for analysing the vast amounts of textual knowledge they contain, to improve the safety and security of railway operations. A case study was conducted between the University of Huddersfield and the Railway Safety Standard Board (RSSB) to analyse EA safety documents using Natural Language Processing (NLP). A graphical user interface was developed that includes various document processing features such as semantic search, document mapping, text summarization, and visualization of key trends.
For the second case study, open-source data was utilized, and textual knowledge was extracted. Several features were also developed, including kernel distribution, analysis of key trends, and sentiment analysis of words (such as unique, positive, and negative) within the documents. Additionally, a heterogeneous framework was designed using CPU/GPU and FPGAs to analyse the computational performance of document mapping.
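Document mapping and semantic search of the kind described above typically rest on vector representations of text. A minimal bag-of-words TF-IDF and cosine-similarity sketch over an invented mini-corpus (the thesis's actual NLP pipeline is not specified here, so this is only the standard baseline technique):

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Sparse TF-IDF vectors for a small corpus (bag-of-words toy sketch)."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter(w for toks in tokenized for w in set(toks))  # document frequency
    return [{w: tf * math.log(n / df[w]) for w, tf in Counter(toks).items()}
            for toks in tokenized]

def cosine(u, v):
    """Cosine similarity between two sparse vectors stored as dicts."""
    dot = sum(x * v.get(w, 0.0) for w, x in u.items())
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Invented mini-corpus: two related safety documents and one unrelated report.
docs = ["signal failure procedure on track",
        "track signal maintenance procedure",
        "annual financial report"]
vecs = tfidf_vectors(docs)
sim_related = cosine(vecs[0], vecs[1])
sim_unrelated = cosine(vecs[0], vecs[2])
```

Mapping a query document then amounts to ranking the corpus by cosine similarity; the two railway documents score higher against each other than against the unrelated report.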
Neuronal Spike Shapes (NSS): A straightforward approach to investigate heterogeneity in neuronal excitability states
The mammalian brain exhibits a remarkable diversity of neurons, contributing to its intricate architecture and functional complexity. The analysis of multimodal single-cell datasets enables the investigation of the heterogeneity of cell types and states. In this study, we introduce Neuronal Spike Shapes (NSS), a straightforward approach for the exploration of excitability states of neurons based on their Action Potential (AP) waveforms. The NSS method describes the AP waveform through a triangular representation complemented by a set of derived electrophysiological (EP) features. To support the hypothesis that these waveform shapes capture excitability states, we validate the proposed approach on two datasets of murine cortical neurons, focusing on GABAergic neurons. The validation process involves a combination of NSS-based clustering analysis, feature exploration, Differential Expression (DE), and Gene Ontology (GO) enrichment analysis. Results show that the NSS-based analysis captures neuronal excitability states that possess biological relevance independently of cell subtype. In particular, NSS captures, among others, a well-characterized fast-spiking excitability state, supported by both electrophysiological and transcriptomic validation. GO enrichment analysis reveals voltage-gated potassium (K+) channels as specific markers of the identified NSS partitions. This finding strongly corroborates the biological relevance of NSS partitions as excitability states, as the expression of voltage-gated K+ channels regulates the hyperpolarization phase of the AP and is directly implicated in the regulation of neuronal excitability.
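A triangular AP summary of the kind NSS builds on can be illustrated with a toy sketch: reduce a sampled waveform to three vertices (onset, peak, after-hyperpolarization trough) and derive simple features from the triangle. The onset heuristic and the synthetic waveform below are assumptions for illustration only; the published method detects threshold from the dV/dt profile and uses a richer feature set:

```python
def triangle_features(v, dt_ms=0.02):
    """Reduce a sampled AP waveform (mV) to a triangle: onset, peak, trough."""
    peak_i = max(range(len(v)), key=lambda i: v[i])
    # Onset approximated as the minimum before the peak (illustrative heuristic;
    # a real implementation would detect threshold from dV/dt).
    onset_i = min(range(peak_i), key=lambda i: v[i]) if peak_i > 0 else 0
    trough_i = min(range(peak_i, len(v)), key=lambda i: v[i])
    height = v[peak_i] - v[onset_i]
    return {
        "height_mV": height,                                  # AP amplitude
        "width_ms": (trough_i - onset_i) * dt_ms,             # triangle base
        "rise_slope": height / ((peak_i - onset_i) * dt_ms),  # depolarization rate
    }

# Synthetic AP sampled at 50 kHz: rest near -65 mV, peak at +30 mV,
# after-hyperpolarization to -75 mV (values invented for illustration).
v = [-65, -64, -60, -40, 0, 30, 10, -30, -70, -75, -70, -66]
feats = triangle_features(v)
```

Clustering such per-spike feature dictionaries across many neurons is then a standard unsupervised-learning step, which is the spirit of the NSS-based partitioning described above.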
Investigation of the metabolism of rare nucleotides in plants
Nucleotides are metabolites involved in primary and specialized metabolism and have a regulatory role in various biochemical reactions in all forms of life. While the nucleotide metabolome has been characterized extensively in other organisms, comparatively little is known about the cellular concentrations of nucleotides in plants. The aim of this dissertation was to investigate the nucleotide metabolome and the enzymes influencing the composition and quantities of nucleotides in plants. For this purpose, a method for the analysis of nucleotides and nucleosides in plants and algae was developed (Chapter 2.1), which comprises efficient quenching of enzymatic activity, liquid-liquid extraction, and solid-phase extraction employing a weak anion-exchange resin. This method allowed analysis of the plant nucleotide metabolome in great depth, including the quantification of low-abundance deoxyribonucleotides and deoxyribonucleosides. The details of the method were summarized in an article serving as a laboratory protocol (Chapter 2.2).
Furthermore, we contributed a review article (Chapter 2.3) that summarizes the literature on nucleotide analysis and recent technological advances, with a focus on plants and on the factors influencing and hindering the analysis of nucleotides in plants, i.e., a complex metabolic matrix, highly stable phosphatases, and the physicochemical properties of nucleotides. To analyze the sub-cellular concentrations of metabolites, a protocol for the rapid isolation of highly pure mitochondria utilizing affinity chromatography was developed (Chapter 2.4).
The method for the purification of nucleotides furthermore contributed to a comprehensive analysis of the nucleotide metabolome in germinating seeds and establishing seedlings of A. thaliana, with a focus on genes involved in the synthesis of thymidylates (Chapter 2.5), and to the characterization of a novel enzyme of purine nucleotide degradation, the XANTHOSINE MONOPHOSPHATE PHOSPHATASE (Chapter 2.6). Protein homology analysis comparing A. thaliana, S. cerevisiae, and H. sapiens led to the identification and characterization of an enzyme involved in the metabolite damage repair system of plants, the INOSINE TRIPHOSPHATE PYROPHOSPHATASE (Chapter 2.7). It was shown that this enzyme dephosphorylates deaminated purine nucleotide triphosphates and thus prevents their incorporation into nucleic acids. Loss-of-function mutants senesce early and have a constitutively increased content of salicylic acid. Also, the source of deaminated purine nucleotides in plants was investigated, and it was shown that abiotic factors contribute to nucleotide damage.
Genomic insights for safety assessment of foodborne bacteria.
Safe food and access to it are key to sustaining life and promoting good health. Unsafe food containing harmful microorganisms or chemical substances causes more than 200 diseases, ranging from diarrhoea to cancers, and particularly affects infants, young children, the elderly, and immunocompromised individuals.
The global burden of foodborne disease affects public health, society, and the economy; therefore, good collaboration between governments, producers, and consumers is needed to help ensure food safety and stronger food systems. The most recent survey conducted by WHO (2015) estimated 600 million ill individuals and 420,000 yearly deaths associated with unsafe food. The economic impact is mainly due to the lack of safe food in low- and middle-income countries, where US$ 110 billion is lost each year in productivity and medical expenses. The main challenges in assuring food safety remain tied to our food production and supply chain, where factors like environmental contamination, consumer preferences, and timely detection and surveillance of outbreaks play a crucial role. Recently, DNA-based methodologies for microbial detection and investigation have sparked special interest, mainly linked to the development of sequencing technologies. Contrary to traditional culture-dependent methods, DNA-based techniques such as Whole Genome Sequencing (WGS) target fast and sensitive results at a relatively low price and with short processing times. Moreover, WGS confers high discriminatory power that allows the determination of important genomic characteristics linked to food safety, such as taxonomy, pathogenic potential, virulence, antimicrobial resistance, and the genetic transfer thereof. Understanding these characteristics is fundamental to designing detection and mitigation strategies to apply along the entire food chain following a 'One Health' perspective, gaining knowledge about the microbiota that affect humans, animals, and the environment.
The aim of the thesis is to gain insight into the genomics of foodborne microbes for their characterization and to create or improve strategies for their detection and mitigation. In particular, this thesis focuses on the assessment of pathogenic potential based on genomic analyses, including taxonomy, virulence, antibiotic resistance, and mobilome studies. The second focus is to profit from these genomic insights to design rapid, time-effective detection devices and reliable mitigation methods to tackle foodborne pathogens.
In more detail, the following topics are addressed:
The presence of multi-drug resistant strains in ready-to-eat fermented food represents a public health risk through the spread of AMR determinants in the food chain and in the gut microbiota of consumers. Genomic analyses permitted accurate assessment of the safety of E. faecium strain UC7251 with respect to its virulence and the co-location of antibiotic and heavy metal resistance genes on mobile elements with conjugation capacity in different matrices. This work emphasizes the importance of surveillance for the presence of AMR bacteria in food and incites the development of innovative strategies for mitigating the risk related to the diffusion of antimicrobial resistance in food.
The accuracy of taxonomic identification drives subsequent analyses and, for this reason, a suitable method to identify species is crucial. The re-classification of Enterococcus faecium clade B species was investigated using a combined approach of phylogenomics, multilocus sequence typing, average nucleotide identity, and digital DNA–DNA hybridization. The goal is to show how genome analysis is more effective and gives more detailed results concerning species definition than analysis of the 16S rRNA sequence. This led to the proposal to reclassify all of E. faecium clade B as E. lactis, recognizing that the two groups are phylogenetically separate, so that a specific safety assessment procedure can be designed before their use in food or as probiotics, including consideration for inclusion in the European QPS list.
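Average nucleotide identity, one of the measures used above, averages the percent identity of genome fragments against their best match in the other genome. A toy exact-window sketch with invented sequences (real ANI pipelines use BLAST- or MUMmer-based alignments over ~1 kb fragments, with coverage cutoffs):

```python
def average_nucleotide_identity(genome_a, genome_b, frag=6):
    """Toy ANI: cut genome A into fragments, score each fragment's best
    ungapped match anywhere in genome B, and average the identities.
    Real ANI pipelines use true alignments and fragment filtering."""
    def identity(f, window):
        return sum(x == y for x, y in zip(f, window)) / len(f)
    best_idents = []
    for i in range(0, len(genome_a) - frag + 1, frag):
        f = genome_a[i:i + frag]
        best_idents.append(max(identity(f, genome_b[j:j + frag])
                               for j in range(len(genome_b) - frag + 1)))
    return 100.0 * sum(best_idents) / len(best_idents)

# Two invented sequences differing by a single substitution.
a = "ATGGCTAAGTTCGATCCGGA"
b = "ATGGCTAAGTACGATCCGGA"
ani = average_nucleotide_identity(a, b)
```

In species delimitation, an ANI around or above ~95-96% between genomes is the conventional same-species signal, which is the kind of threshold such re-classification studies lean on.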
Building on this taxonomic re-classification, we developed a PCR-based method for the rapid detection and differentiation of these two species and discuss the main phenotypic and genotypic differences from a clinical perspective. To this aim, a core-genome alignment based on pangenome analysis was used. Allelic differences between certain core genes allowed primer design and species identification through PCR with 100% specificity and no cross-reactivity. Moreover, clinical E. lactis genomes were categorised as a potential risk due to their capacity for enhanced bacterial translocation.
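The idea of exploiting allelic differences in core genes for species-specific primers can be sketched as a toy k-mer screen: keep k-mers present in every genome of one species and absent from every genome of the other. The sequences below are invented, and real primer design additionally checks melting temperature, GC content, amplicon length, and in-silico cross-reactivity:

```python
def discriminating_kmers(species_a, species_b, k=8):
    """Toy primer-site screen: k-mers present in every genome of species A
    and absent from every genome of species B."""
    def kmers(seq):
        return {seq[i:i + k] for i in range(len(seq) - k + 1)}
    core_a = set.intersection(*(kmers(g) for g in species_a))  # shared by all A
    pan_b = set.union(*(kmers(g) for g in species_b))          # seen in any B
    return core_a - pan_b

# Invented genomes: species B differs from species A by one substitution.
a_genomes = ["ATGCGTACGTTAGCAT", "CCATGCGTACGTTAGC"]
b_genomes = ["ATGCGTAAGTTAGCAT"]
sites = discriminating_kmers(a_genomes, b_genomes)
```

Every surviving k-mer spans the discriminating allele, which is exactly the property that makes a primer anchored there species-specific in the PCR assay.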
Antimicrobial agents alternative to antibiotics are one of the main areas of development and improvement in the current food chain. Metallic nanoparticles such as platinum nanoparticles (PtNPs) have attracted interest due to their potent catalytic activities, similar to those of oxidases and peroxidases, granting strong antimicrobial effects, and have been proposed as potential candidates to overcome drawbacks of antibiotics such as drug resistance. The goal is to study the mode of action of PtNPs in relation to biofilm formation capacity, the reactive oxygen species (ROS) coping mechanism, and quorum sensing, using foodborne bacteria such as Enterococcus faecium and Salmonella Typhimurium.
Prediction of Cytotoxicity-Related PubChem Assays Using High-Content-Imaging Descriptors Derived from Cell Painting
The pharmaceutical industry is centred around small molecules and their effects. Apart from the curative effect, the absence of adverse or toxicological effects is cardinal. However, toxicity is at least as elusive as it is important. A simple definition is: 'toxicology is the science of adverse effects of chemicals on living organisms' [1]. However, this definition comprises several caveats. What is the organism? Where do therapeutic and adverse effects start and end? Even for the simplest form of toxicity, cytotoxicity, the mechanisms are manifold and difficult to unravel. Hence, it remains obscure which characteristics a compound has to combine to be labelled as toxic. One attempt to illuminate these characteristics is the novel cell-painting (CP) assay. For a CP assay, cells are perturbed by libraries of small compounds, which may affect the cellular morphology, before images are taken via automated fluorescence microscopy. Five fluorescent channels are used for imaging, each corresponding to certain cell organelles [2]. Therefore, CP data contain information about the cell-structure variations caused by each compound. Which part of the information within these morphological fingerprints is actually valuable remains elusive. Therefore, a significant part of the project presented here is dedicated to exploring the CP data and their predictive capabilities comparatively: they are compared against different descriptors across a variety of bioassays. The CP data used in this project contain roughly 30 000 compounds and 1800 features [3].
In chemistry, the structure determines the properties of a compound or substance. Therefore, apart from CP, structural fingerprints are used as a benchmark descriptor set for comparison. In this project, extended-connectivity fingerprints (ECFPs) were used to encode the compounds' structures as numerical features.
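The circular-fingerprint idea behind ECFPs, where atom environments of growing radius are hashed into a fixed-length bit vector, can be sketched with a toy molecular graph. Real ECFPs are computed from full molecular graphs with bond orders and atom invariants (e.g. via RDKit's Morgan fingerprints); the tiny graph format and hashing scheme here are illustrative assumptions.

```python
# Pure-Python sketch of an ECFP-like circular fingerprint: each atom's
# neighbourhood identifier is iteratively grown and hashed into a bit vector.
# The graph encoding and hash folding are simplified assumptions.
import hashlib

def ecfp_like(atoms, bonds, radius=2, n_bits=64):
    """atoms: {idx: element symbol}; bonds: {idx: [neighbour indices]}."""
    env = dict(atoms)               # radius-0 identifiers = element symbols
    bits = set()
    for _ in range(radius + 1):
        for ident in env.values():
            h = int(hashlib.md5(ident.encode()).hexdigest(), 16)
            bits.add(h % n_bits)    # fold the hash into the bit vector
        # grow each environment by one bond: append sorted neighbour identifiers
        env = {i: env[i] + "".join(sorted(env[j] for j in bonds[i]))
               for i in env}
    fp = [0] * n_bits
    for b in bits:
        fp[b] = 1
    return fp

# Ethanol as a hypothetical heavy-atom graph: C-C-O
atoms = {0: "C", 1: "C", 2: "O"}
bonds = {0: [1], 1: [0, 2], 2: [1]}
fingerprint = ecfp_like(atoms, bonds)
print(sum(fingerprint))  # number of bits set for this toy molecule
```

Because the hash is deterministic, two compounds sharing a substructure light up overlapping bits, which is what makes such fingerprints useful numerical features for the machine learning models discussed below.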
This work is concerned with morphological changes that correspond to toxicity. Thus, the CP data were combined with toxicological endpoints from specific assays selected from the PubChem database. The selection process enforced a minimum number of active compounds, a size criterion and the occurrence of toxicologically relevant targets. After the selected assays were combined with each of the descriptor sets, machine learning models were trained and their predictive power was evaluated against specific metrics. The predictions can be divided into four cycles. In the first cycle, the CP data are used as descriptors; the second cycle used the structural fingerprints; and the third cycle used a subset of both, selected by a rigorous feature engineering process. The last cycle skipped the feature engineering and combined all CP and ECFP descriptors into one large set of inputs. The evaluation of the prediction metrics illuminates the strengths and shortcomings of the morphological fingerprints compared to the structural fingerprints. It turned out that there are two groups of assays: those PubChem assays that are generally better predicted with CP features and those that have higher predictive potential when using ECFP. Additionally, it was revealed that ECFP show higher specificity, whereas CP data show higher sensitivity. A high sensitivity means the prediction rarely mislabels a sample as negative (e.g. non-toxic) relative to the number of correctly labelled positive samples (e.g. toxic compounds). Based on these results, CP is better suited for toxicity prediction and drug safety evaluations, since a toxic compound mislabelled as non-toxic can lead to expenses or even damage to health. Furthermore, based on the data from the fluorescent channels, an enrichment measure was introduced and calculated for the aforementioned two groups of PubChem assays. This enrichment connects predictive performance with cell organelle activity. The hypothesis was that PubChem assays reliably predictable from CP data should exhibit increased enrichment, which was the case for four out of five fluorescence microscopy channels. As a next step, phenotypic terms were manually generated to categorize the different PubChem assays. These terms corresponded to cellular mechanisms or morphological processes and were generated in an unbiased manner; nevertheless, they are subject to human error. The phenotypic annotations found to be enriched for successful modelling approaches might guide the preselection of bioassays in future projects. The enrichment analysis of phenotypic annotations detected that PubChem assays that could be well predicted via CP data are related to immune response, genotoxicity, genome regulation and cell death.
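The sensitivity/specificity trade-off between the two descriptor types can be made concrete with the standard confusion-matrix definitions. The labels and predictions below are toy data (1 = toxic/active, 0 = non-toxic/inactive), not results from the study.

```python
# Sketch of the sensitivity and specificity metrics used to compare CP- and
# ECFP-based models; the label vectors are illustrative toy data.

def sensitivity(y_true, y_pred):
    """TP / (TP + FN): fraction of toxic compounds correctly flagged."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn)

def specificity(y_true, y_pred):
    """TN / (TN + FP): fraction of non-toxic compounds correctly cleared."""
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tn / (tn + fp)

y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 1, 1]   # misses one toxic, flags two safe ones
print(sensitivity(y_true, y_pred))  # 0.75
print(specificity(y_true, y_pred))  # 0.5
```

In a drug-safety setting a false negative (a toxic compound cleared as safe) is the costly error, which is why the higher sensitivity of the CP-based models matters more than their lower specificity.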
Finally, the assays are assigned gene ontology (GO) terms obtained from the GO database. These terms comprise a controlled, structured vocabulary that explicitly describes the molecular functions and biological processes of a given gene product. For PubChem assays associated with a protein target, the GO terms are collected. If an assay is particularly well predicted via CP descriptors, the associated GO terms can relate this finding to cellular function. Even though the analysis with GO terms suffers from a very small sample size, it was found that CP-related assays usually correspond to processes concerning deoxyribonucleic acid (DNA) and other macromolecules. This finding is in good agreement with the analyses of the channel enrichment as well as the phenotypic enrichment.
Attributes in Cloud Service Descriptions: A Comprehensive Content Analysis
The exponential growth of cloud services can make it challenging for customers to find the best available service. This problem is further aggravated by incomplete and non-standardized service descriptions on cloud providers' websites. This issue has not yet been adequately researched. In response to this gap, and following the call (Lehner & Floerecke, 2023) to analyse IT service catalogues directed toward external customers, the purpose of this work is to examine attribute usage in customer-facing service descriptions available on providers' websites. A literature review identified 76 different attributes used for cloud service descriptions. Although a vast number of attributes are in use, a core set of attributes named in most papers could be detected. In a following step, a content analysis of 100 service descriptions available on cloud providers' websites was performed to understand how frequently each attribute was used, both for cloud providers in general and differentiated by size, cloud service model (IaaS, PaaS, SaaS), and geographical location of the provider. The majority of attributes from the literature review could thereby be found in the content analysis as well. Fifteen more attributes were added to the initial list as they could not be matched to any of the attributes from the literature. In addition, it could be verified that criteria such as size, service model, and geographical location have a significant impact on attribute usage in service descriptions. Finally, expert interviews were conducted to gain additional insights. The consensus of the experts is that the main purpose of cloud service descriptions available on cloud providers' websites is not necessarily to inform customers, but to attract and convince them.
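The tallying step behind such a content analysis, counting how often each attribute appears overall and per provider criterion, can be sketched briefly. The attribute names and records below are hypothetical stand-ins for the coded service descriptions.

```python
# Minimal sketch of attribute-frequency counting for a content analysis,
# split by one provider criterion (here: service model). Data are hypothetical.
from collections import Counter, defaultdict

descriptions = [
    {"model": "IaaS", "attributes": ["price", "availability", "security"]},
    {"model": "IaaS", "attributes": ["price", "scalability"]},
    {"model": "SaaS", "attributes": ["price", "support"]},
    {"model": "SaaS", "attributes": ["security", "support"]},
]

overall = Counter()                 # attribute usage across all providers
by_model = defaultdict(Counter)     # attribute usage per service model
for d in descriptions:
    overall.update(d["attributes"])
    by_model[d["model"]].update(d["attributes"])

print(overall.most_common(2))       # [('price', 3), ...]
print(dict(by_model["SaaS"]))
```

Comparing the per-criterion counters against the overall counter is what reveals whether size, service model, or location shifts which attributes providers choose to describe.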
The insights of this work provide valuable information for both customers and cloud providers by identifying which attributes are currently used or not used in cloud service descriptions on providers' websites, and can serve as a foundation for further research.