
    Investigating the dynamics of Greenland's glacier-fjord systems

    Over the past two decades, Greenland’s tidewater glaciers have dramatically retreated, thinned and accelerated, contributing significantly to sea level rise. This change in glacier behaviour is thought to have been triggered by increasing atmospheric and ocean temperatures, and mass loss from Greenland’s tidewater glaciers is predicted to continue this century. Substantial research during this period of rapid glacier change has improved our understanding of Greenland’s glacier-fjord systems. However, many of the processes operating in these systems that ultimately control the response of tidewater glaciers to changing atmospheric and oceanic conditions are poorly understood. This thesis combines modelling and remote sensing to investigate two particularly poorly-understood components of glacier-fjord systems, with the ultimate aim of improving understanding of recent glacier behaviour and constraining the stability of the ice sheet in a changing climate. The research presented in this thesis begins with an investigation into the dominant controls on the seasonal dynamics of contrasting tidewater glaciers draining the Greenland Ice Sheet. To do this, high-resolution estimates of ice velocity were generated and compared with detailed observations and modelling of the principal controls on seasonal glacier flow, including terminus position, ice mélange presence or absence, ice sheet surface melting and runoff, and plume presence or absence. These data revealed characteristic seasonal and shorter-term changes in ice velocity at each of the study glaciers in more detail than was available from previous remote sensing studies. Of all the environmental controls examined, seasonal evolution of subglacial hydrology (as inferred from plume observations and modelling) was best able to explain the observed ice flow variations, despite differences in geometry and flow of the study glaciers.
The inferred relationships between subglacial hydrology and ice dynamics were furthermore entirely consistent with process-understanding developed at land-terminating sectors of the ice sheet. This investigation provides a more detailed understanding of tidewater glacier subglacial hydrology and its interaction with ice dynamics than was previously available and suggests that interannual variations in meltwater supply may have limited influence on annually averaged ice velocity. The thesis then shifts its attention from the glacier part of the system into the fjords, focusing on the interaction between icebergs, fjord circulation and fjord water properties. This focus on icebergs is motivated by recent research revealing that freshwater produced by iceberg melting constitutes an important component of fjord freshwater budgets, yet the impact of this freshwater on fjords was unknown. To investigate this, a new model for iceberg-ocean interaction is developed and incorporated into an ocean circulation model. This new model is first applied to Sermilik Fjord — a large fjord in east Greenland that hosts Helheim Glacier, one of the largest tidewater glaciers draining the ice sheet — to further constrain iceberg freshwater production and to quantify the influence of iceberg melting on fjord circulation and water properties. These investigations reveal that iceberg freshwater flux increases with ice sheet runoff raised to the power ~0.1 and ranges from ~500-2500 m³ s⁻¹ during summer, with ~40% of that produced below the pycnocline. It is also shown that icebergs substantially modify the temperature and velocity structure of Sermilik Fjord, causing 1-5°C cooling in the upper ~100 m and invigorating fjord circulation, which in turn causes a 10-40% increase in oceanic heat flux towards Helheim Glacier.
This research highlights the important role of icebergs in Greenland’s iceberg-congested fjords and therefore the need to include them in future studies examining ice sheet – ocean interaction. Having investigated the effect of icebergs on fjord circulation in a realistic setting, this thesis then characterises the effect of submarine iceberg melting on water properties near the ice sheet – ocean interface by applying the new model to a range of idealised scenarios. This near-glacier region is crucial for constraining ocean-driven retreat of tidewater glaciers, but remains poorly understood. The simulations show that icebergs are important modifiers of glacier-adjacent water properties, generally acting to reduce vertical variations in water temperature. The iceberg-induced temperature changes will generally increase submarine melt rates at mid-depth and decrease rates at the surface, with less pronounced effects at greater depth. This highlights another mechanism by which iceberg melting can affect ice sheet – ocean interaction and emphasises the need to account for iceberg-ocean interaction when simulating ocean-driven retreat of Greenland’s tidewater glaciers. In summary, this thesis has helped to provide a deeper understanding of two poorly-understood components of Greenland’s tidewater glacier-fjord systems: (i) interactions between subglacial hydrology and ice velocity; and (ii) iceberg-ocean interaction. This research has enabled more precise interpretations of past glacier behaviour and can be used to inform model development that will help constrain future ice sheet mass loss in response to a changing climate.
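The weak runoff dependence reported in this abstract can be sketched numerically. The snippet below is a minimal illustration, assuming a simple power law Q = c·R^0.1; the ~0.1 exponent and the ~500-2500 m³ s⁻¹ summer range come from the abstract, while the function name and the coefficient are hypothetical placeholders chosen only to land in that range.

```python
def iceberg_freshwater_flux(runoff: float, coeff: float = 1200.0,
                            exponent: float = 0.1) -> float:
    """Illustrative power law: flux (m^3/s) = coeff * runoff**exponent.

    The ~0.1 exponent is the value reported in the abstract; the
    coefficient is a hypothetical placeholder, not a fitted value.
    """
    return coeff * runoff ** exponent

# The weak exponent means iceberg freshwater flux is nearly insensitive to
# runoff: a tenfold increase in runoff raises the flux by only 10**0.1 ~ 26%.
ratio = iceberg_freshwater_flux(1000.0) / iceberg_freshwater_flux(100.0)
print(round(ratio, 2))  # -> 1.26
```

This is one way to read the result: because the exponent is so small, interannual runoff variability translates into only modest variability in iceberg freshwater flux.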

    Converging organoids and extracellular matrix: New insights into liver cancer biology

    Primary liver cancer, consisting primarily of hepatocellular carcinoma (HCC) and cholangiocarcinoma (CCA), is a heterogeneous malignancy with a dismal prognosis and the third leading cause of cancer mortality worldwide [1, 2]. It is characterized by unique histological features, late-stage diagnosis, a highly variable mutational landscape, and high levels of heterogeneity in biology and etiology [3-5]. Treatment options are limited, with surgical intervention the main curative option, although it is unavailable to the majority of patients, who are diagnosed at an advanced stage. Major contributing factors to the complexity and limited treatment options are the interactions between primary tumor cells, non-neoplastic stromal and immune cells, and the extracellular matrix (ECM). ECM dysregulation plays a prominent role in multiple facets of liver cancer, including initiation and progression [6, 7]. HCC often develops in already damaged environments containing large areas of inflammation and fibrosis, while CCA is commonly characterized by significant desmoplasia, the extensive formation of connective tissue surrounding the tumor [8, 9]. Thus, to gain a better understanding of liver cancer biology, sophisticated in vitro tumor models need to comprehensively incorporate the various aspects that together dictate liver cancer progression. Therefore, the aim of this thesis is to create in vitro liver cancer models through organoid technology approaches, allowing for novel insights into liver cancer biology and, in turn, providing potential avenues for therapeutic testing. To model primary epithelial liver cancer cells, organoid technology is employed in part I. To study and characterize the role of ECM in liver cancer, decellularization of tumor tissue, adjacent liver tissue, and distant metastatic organs (i.e.
lung and lymph node) is described, characterized, and combined with organoid technology to create improved tissue-engineered models for liver cancer in part II of this thesis. Chapter 1 provides a brief introduction to the concepts of liver cancer, cellular heterogeneity, decellularization and organoid technology. It also explains the rationale behind the work presented in this thesis. Chapter 2 provides an in-depth analysis of organoid technology and contrasts it with other in vitro cell culture systems employed for liver cancer modeling. Reliable establishment of liver cancer organoids is crucial for advancing translational applications of organoids, such as personalized medicine. Therefore, as described in chapter 3, a multi-center analysis was performed on the establishment of liver cancer organoids. This revealed a global establishment efficiency rate of 28.2% (19.3% for hepatocellular carcinoma organoids (HCCO) and 36% for cholangiocarcinoma organoids (CCAO)). Additionally, potential solutions and future perspectives for increasing establishment efficiency are provided. Liver cancer organoids consist solely of primary epithelial tumor cells. To engineer an in vitro tumor model with the possibility of immunotherapy testing, CCAO were combined with immune cells in chapter 4. Co-culture of CCAO with peripheral blood mononuclear cells and/or allogenic T cells revealed an effective anti-tumor immune response, with distinct interpatient heterogeneity. These cytotoxic effects were mediated by cell-cell contact and release of soluble factors, although indirect killing through soluble factors was only observed in one organoid line. Thus, this model provides a first step towards developing immunotherapy for CCA on an individual patient level. The success of personalized medicine depends on an organoid's ability to faithfully recapitulate patient tissue.
Therefore, in chapter 5 a novel organoid system was created in which branching morphogenesis was induced in cholangiocyte and CCA organoids. Branching cholangiocyte organoids self-organized into tubular structures with high similarity to primary cholangiocytes, based on single-cell sequencing and functionality. Similarly, branching CCAO adopted a morphology in vitro more similar to that of primary tumors. Moreover, these branching CCAO correlated more closely with the transcriptomic profile of patient-paired tumor tissue and showed increased resistance to gemcitabine and cisplatin, the standard chemotherapy regimen for CCA patients in the clinic. As discussed, CCAO represent the epithelial compartment of CCA. Proliferation, invasion, and metastasis of epithelial tumor cells are highly influenced by the interaction with their cellular and extracellular environment. The remodeling of various properties of the extracellular matrix (ECM), including stiffness, composition, alignment, and integrity, influences tumor progression. Chapter 6 discusses the alterations of the ECM in solid tumors and the translational impact of our increased understanding of these alterations. The success of ECM-related cancer therapy development requires an intimate understanding of the malignancy-induced changes to the ECM. This principle was applied to liver cancer in chapter 7, whereby the dysregulation of liver cancer ECM was characterized through an integrative molecular and mechanical approach. An optimized agitation-based decellularization protocol was established for primary liver cancer (HCC and CCA) and paired adjacent tissue (HCC-ADJ and CCA-ADJ). Novel malignancy-related ECM protein signatures were found, which had previously been overlooked in liver cancer transcriptomic data. Additionally, the mechanical characteristics were probed, revealing divergent macro- and micro-scale mechanical properties and a higher alignment of collagen in CCA.
This study provided a better understanding of ECM alterations during liver cancer as well as a potential scaffold for organoid culture. This was applied to CCA in chapter 8 by combining decellularized CCA tumor ECM and tumor-free liver ECM with CCAO to study cell-matrix interactions. Culture of CCAO in tumor ECM resulted in a transcriptome closely resembling in vivo patient tumor tissue, accompanied by an increase in chemoresistance. In tumor-free liver ECM, devoid of desmoplasia, CCAO initiated a desmoplastic reaction through increased collagen production. If desmoplasia was already present, the organoids produced distinct, tumor-related ECM proteins associated with poor patient survival. To extend this method of studying cell-matrix interactions to a metastatic setting, lung and lymph node tissue was decellularized and recellularized with CCAO in chapter 9, as these are common locations of metastasis in CCA. Decellularization removed cells while preserving ECM structure and protein composition, linked to tissue-specific functional hallmarks. Recellularization revealed that lung and lymph node ECM induced different gene expression profiles in the organoids, related to cancer stem cell phenotype, cell-ECM integrin binding, and epithelial-to-mesenchymal transition. Furthermore, the metabolic activity of CCAO in lung and lymph node was significantly influenced by the metastatic location, the original characteristics of the patient tumor, and the donor of the target organ. The in vitro tumor models described above utilized decellularized scaffolds with native structure. Decellularized ECM can also be used to create tissue-specific hydrogels through digestion and gelation procedures. These hydrogels were created from both porcine and human livers in chapter 10.
The liver ECM-based hydrogels were used to initiate and culture healthy cholangiocyte organoids, which maintained cholangiocyte marker expression, thus providing an alternative to BME for organoid initiation. Building upon this, in chapter 11 human liver ECM-based extracts were used in combination with a one-step microfluidic encapsulation method to produce size-standardized CCAO. The established system can reduce the size variability conventionally seen in organoid culture by providing uniform scaffolding. Encapsulated CCAO retained their stem cell phenotype and were amenable to drug screening, showing the feasibility of scalable CCAO production for high-throughput drug screening approaches. Lastly, chapter 12 provides a global discussion and future outlook on tumor tissue engineering strategies for liver cancer using organoid technology and decellularization. Combining multiple aspects of liver cancer, both cellular and extracellular, with tissue engineering strategies provides advanced tumor models that can delineate fundamental mechanistic insights as well as provide a platform for drug screening approaches.

    Multidisciplinary perspectives on Artificial Intelligence and the law

    This open access book presents an interdisciplinary, multi-authored, edited collection of chapters on Artificial Intelligence (‘AI’) and the Law. AI technology has come to play a central role in the modern data economy. Through a combination of increased computing power, the growing availability of data and the advancement of algorithms, AI has now become an umbrella term for some of the most transformational technological breakthroughs of this age. The importance of AI stems from both the opportunities that it offers and the challenges that it entails. While AI applications hold the promise of economic growth and efficiency gains, they also create significant risks and uncertainty. The potential and perils of AI have thus come to dominate modern discussions of technology and ethics – and although AI was initially allowed to largely develop without guidelines or rules, few would deny that the law is set to play a fundamental role in shaping the future of AI. As the debate over AI is far from over, the need for rigorous analysis has never been greater. This book thus brings together contributors from different fields and backgrounds to explore how the law might provide answers to some of the most pressing questions raised by AI. An outcome of the Católica Research Centre for the Future of Law and its interdisciplinary working group on Law and Artificial Intelligence, it includes contributions by leading scholars in the fields of technology, ethics and the law.

    A human motion measurement method for programming by demonstration

    Programming by demonstration (PbD) is an intuitive approach to impart a task to a robot from one or several demonstrations by a human teacher. Acquiring the demonstrations involves solving the correspondence problem when the teacher and the learner differ in sensing and actuation. Kinesthetic guidance is widely used to perform demonstrations: the robot is manipulated by the teacher and the demonstrations are recorded by the robot's encoders. In this way, the correspondence problem is trivial, but the teacher's dexterity is impaired, which may affect the PbD process. Methods that are more practical for the teacher usually require the identification of some mappings to solve the correspondence problem. The demonstration acquisition method is therefore a compromise between the difficulty of identifying these mappings, the accuracy of the recorded elements, and the user-friendliness and convenience for the teacher. This thesis proposes an inertial human motion tracking method based on inertial measurement units (IMUs) for PbD of pick-and-place tasks. Compared to kinesthetic guidance, IMUs are convenient and easy to use but offer limited accuracy. Their potential for PbD applications is investigated. To estimate the trajectory of the teacher's hand, three IMUs are placed on the arm segments (arm, forearm and hand) to estimate their orientations. A specific method is proposed to partially compensate for the well-known drift of the sensor orientation estimate around the gravity direction by exploiting the particular configuration of the demonstration. This method, called heading reset, is based on the assumption that the sensor passes through its original heading, with stationary phases, several times during the demonstration. The heading reset is implemented in an integration and vector observation algorithm. Several experiments illustrate the advantages of this heading reset.
A comprehensive inertial human hand motion tracking (IHMT) method for PbD is then developed. It includes an initialization procedure to estimate the orientation of each sensor with respect to the corresponding human arm segment and the initial orientation of the sensor with respect to the teacher-attached frame. The procedure involves a rotation and a static position of the extended arm, making the measurement system robust to the positioning of the sensors on the segments. A procedure for estimating the position of the human teacher relative to the robot and a calibration procedure for the parameters of the method are also proposed. The error of the resulting human hand trajectory is measured experimentally and found to lie between 28.5 mm and 61.8 mm. The mappings to solve the correspondence problem are identified. Unfortunately, the observed level of accuracy of this IHMT method is not sufficient for a PbD process. In order to reach the necessary level of accuracy, a method is proposed to correct the hand trajectory obtained by IHMT using vision data. A vision system is complementary to inertial sensors. For the sake of simplicity and robustness, the vision system tracks only the objects, not the teacher. The correction is based on so-called Positions Of Interest (POIs) and involves three steps: the identification of the POIs in the inertial and vision data, the pairing of hand POIs to object POIs that correspond to the same action in the task, and finally the correction of the hand trajectory based on the pairs of POIs. The complete demonstration acquisition method is experimentally evaluated in a full PbD process. This experiment reveals the advantages of the proposed method over kinesthetic guidance in the context of this work.
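The heading-reset idea described in this abstract can be illustrated with a toy one-dimensional sketch. This is not the thesis's implementation (which embeds the reset in an integration and vector observation algorithm on full 3-D orientations); the function name, threshold, and data below are hypothetical.

```python
import numpy as np

def heading_reset(yaw, gyro_norm, yaw0=0.0, stationary_thresh=0.05):
    """Toy 1-D illustration of the 'heading reset' idea: whenever the sensor
    is stationary (small angular-rate norm), assume it has returned to its
    initial heading yaw0 and subtract the drift accumulated so far."""
    yaw = np.asarray(yaw, dtype=float).copy()
    offset = 0.0
    for i in range(len(yaw)):
        if gyro_norm[i] < stationary_thresh:
            offset = yaw[i] - yaw0  # drift estimated at this stationary sample
        yaw[i] -= offset
    return yaw

# True heading is constant (yaw0 = 0) but the estimate drifts linearly;
# a single stationary phase at sample 5 lets the reset remove the drift.
drifting = 0.01 * np.arange(10)       # rad, pure simulated drift
gyro = np.ones(10)
gyro[5] = 0.0                         # stationary only at sample 5
corrected = heading_reset(drifting, gyro)
```

Note the residual drift before the first stationary phase is untouched; the reset only removes drift accumulated up to each detected stationary sample, which matches the abstract's "partial compensation" framing.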

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    LIPIcs, Volume 251, ITCS 2023, Complete Volume

    Machine learning applications in search algorithms for gravitational waves from compact binary mergers

    Gravitational waves from compact binary mergers are now routinely observed by Earth-bound detectors. These observations enable exciting new science, as they have opened a new window to the Universe. However, extracting gravitational-wave signals from the noisy detector data is a challenging problem. The most sensitive search algorithms for compact binary mergers use matched filtering, an algorithm that compares the data with a set of expected template signals. As detectors are upgraded and more sophisticated signal models become available, the number of required templates will increase, which can make some sources computationally prohibitive to search for. The computational cost is of particular concern when low-latency alerts must be issued to maximize the time for electromagnetic follow-up observations. One potential solution to reduce computational requirements that has started to be explored in the last decade is machine learning. However, different proposed deep learning searches target varying parameter spaces and use metrics that are not always comparable to existing literature. Consequently, a clear picture of the capabilities of machine learning searches has been sorely missing. In this thesis, we closely examine the sensitivity of various deep learning gravitational-wave search algorithms and introduce new methods to detect signals from binary black hole and binary neutron star mergers at previously untested statistical confidence levels. By using the sensitive distance as our core metric, we allow for a direct comparison of our algorithms to state-of-the-art search pipelines. As part of this thesis, we organized a global mock data challenge to create a benchmark for machine learning search algorithms targeting compact binaries. In this way, the tools developed in this thesis are made available to the wider community as open source software.
Our studies show that, depending on the parameter space, deep learning gravitational-wave search algorithms are already competitive with current production search pipelines. We also find that strategies developed for traditional searches can be effectively adapted to their machine learning counterparts. In regions where matched filtering becomes computationally expensive, however, available deep learning algorithms are also limited in their capability. We find reduced sensitivity to long-duration signals compared to the excellent results for short-duration binary black hole signals.
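The matched-filtering baseline that these deep learning searches are compared against can be sketched as a normalized sliding correlation. The toy time-domain version below assumes white noise (production pipelines work in the frequency domain with noise-weighted inner products over large template banks); the function name and data are illustrative only.

```python
import numpy as np

def matched_filter_peak(data, template):
    """Slide a unit-norm template over the data and return the peak
    correlation and its offset -- a toy stand-in for the matched filter
    used by compact-binary search pipelines (white noise assumed)."""
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    m = len(t)
    corr = [float(np.dot(data[i:i + m], t)) for i in range(len(data) - m + 1)]
    return max(corr), int(np.argmax(corr))

# Inject a scaled copy of the template into an otherwise empty data stream:
# the peak of the filter output recovers the injection time.
template = np.sin(2 * np.pi * np.arange(64) / 16)   # 4 cycles of a toy "chirpless" signal
data = np.zeros(256)
data[50:50 + 64] += 5.0 * template
peak, offset = matched_filter_peak(data, template)
print(offset)  # -> 50
```

The cost driver mentioned in the abstract is visible even here: the filter must be evaluated once per template and per time offset, so the work scales linearly with the size of the template bank.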

    Investigating the learning potential of the Second Quantum Revolution: development of an approach for secondary school students

    In recent years we have witnessed important changes: the Second Quantum Revolution is in the spotlight of many countries, and it is creating a new generation of technologies. To unlock its potential, several countries have launched strategic plans and research programs that finance and set the pace of research and development of these new technologies (such as the Quantum Flagship and the National Quantum Initiative Act). The increasing pace of technological change also challenges science education and institutional systems, requiring them to help prepare new generations of experts. This work is situated within physics education research and contributes to this challenge by developing an approach and a course about the Second Quantum Revolution. The aims are to promote quantum literacy and, in particular, to highlight the cultural and educational value of the Second Quantum Revolution. The dissertation is articulated in two parts. In the first, we unpack the Second Quantum Revolution from a cultural perspective and shed light on its main revolutionary aspects, which are elevated to the rank of principles implemented in the design of a course for secondary school students and prospective and in-service teachers. The design process and the educational reconstruction of the activities are presented, as well as the results of a pilot study conducted to investigate the impact of the approach on students' understanding and to gather feedback for refining and improving the instructional materials. The second part explores the Second Quantum Revolution as a context for introducing some basic concepts of quantum physics. We present the results of an implementation with secondary school students to investigate whether, and to what extent, external representations can promote students' understanding and acceptance of quantum physics as a reliable description of the world.

    Reconstruction and Synthesis of Human-Scene Interaction

    In this thesis, we argue that the 3D scene is vital for understanding, reconstructing, and synthesizing human motion. We present several approaches that take the scene into consideration in reconstructing and synthesizing Human-Scene Interaction (HSI). We first observe that state-of-the-art pose estimation methods ignore the 3D scene and hence reconstruct poses that are inconsistent with it. We address this by proposing a pose estimation method that takes the 3D scene explicitly into account. We call our method PROX, for Proximal Relationships with Object eXclusion. We leverage the data generated using PROX and build a method to automatically place 3D scans of clothed people in scenes. The core novelty of our method is encoding the proximal relationships between the human and the scene in a novel HSI model, called POSA, for Pose with prOximitieS and contActs. POSA is, however, limited to static HSI. We therefore propose a real-time method for synthesizing dynamic HSI, which we call SAMP, for Scene-Aware Motion Prediction. SAMP enables virtual humans to navigate cluttered indoor scenes and naturally interact with objects. Data-driven kinematic models like SAMP can produce high-quality motion when applied in environments similar to those seen in the training data. However, when applied to new scenarios, kinematic models can struggle to generate realistic behaviors that respect scene constraints. In contrast, we present InterPhys, which uses adversarial imitation learning and reinforcement learning to train physically simulated characters that perform scene-interaction tasks in a physical and life-like manner.