Nucleolar stress and IL-1 signaling in hematopoietic stem cell aging
The aging of the hematopoietic system is driven in part by defects occurring in hematopoietic stem cells (HSCs). Given that HSCs provide the organism with blood and immune cells lifelong, understanding the mechanisms underlying HSC aging is vital to developing interventions that address the deterioration of the hematopoietic system at its root. Past work has indicated roles for both intrinsic and extrinsic processes in driving HSC decline during aging. Still, these roles are not fully understood, especially the relationship between different drivers and the mechanisms by which HSCs maintain functionality in the face of age-related insults.
To better understand the cell-intrinsic regulation of HSC aging, we investigated nucleolar DNA damage marks stemming from replication stress in old HSCs and connected them with the induction of nucleolar stress, which impairs protein translation and cell cycling. Although nucleolar stress dampens old HSC activity, we reveal the cytoprotective effect of the p53-mediated nucleolar stress response to be essential for preserving the residual potential of old HSCs.
Additionally, though inflammation from the niche contributes to HSC aging, the exact role of microenvironmental alterations often remains unclear. Here, we uncover an important role for IL-1 derived from endosteal stromal cells in driving both HSC and niche cell aging, and demonstrate inhibition of IL-1 signaling as a tractable strategy that counters niche deterioration to improve HSC function. These findings unveil new mechanisms of HSC aging, raise the possibility that nucleolar stress signaling could be harnessed to improve the output of old HSCs in clinical settings, and demonstrate the therapeutic viability of IL-1 blockade in improving old HSC function.
Optical-Based Microsecond Latency MHD Mode Tracking Through Deep Learning
Active feedback control in magnetic confinement fusion devices is desirable to mitigate plasma instabilities and enable robust operation. Among various diagnostics, optical high-speed cameras provide a powerful, non-invasive diagnostic and can be suitable for these applications.
This thesis reports the first application of high-speed video imaging and deep learning as a real-time diagnostic of rotating MHD modes in a tokamak device. The developed system uses a convolutional neural network (CNN) to predict the amplitudes of the n = 1 sine and cosine mode components using only optical measurements acquired from one or more cameras. Using the newly assembled high-speed camera diagnostics on the High Beta Tokamak-Extended Pulse (HBT-EP) device, an experimental dataset consisting of camera frame images and magnetics-based mode measurements was assembled and used to develop the mode-tracking CNN model. The optimized models outperformed other tested conventional algorithms given identical image inputs.
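For readers unfamiliar with the setup, the following is a minimal sketch of the kind of CNN regressor described: a small network mapping a single camera frame to the two mode-amplitude outputs. The input size, layer widths, and training details are illustrative assumptions, not the architecture developed in the thesis.

```python
# Illustrative sketch only: a small CNN that regresses the n = 1 sine and cosine
# mode amplitudes from a single camera frame. The frame size, layer widths, and
# training details are assumptions, not the model used in the thesis.
import torch
import torch.nn as nn

class ModeTrackerCNN(nn.Module):
    def __init__(self, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.LazyLinear(64), nn.ReLU(),
            nn.Linear(64, 2),   # outputs: [sine amplitude, cosine amplitude]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

model = ModeTrackerCNN()
frames = torch.randn(4, 1, 32, 64)    # batch of hypothetical camera frames
targets = torch.randn(4, 2)           # magnetics-derived mode amplitudes as labels
loss = nn.functional.mse_loss(model(frames), targets)
loss.backward()                       # one regression training step (optimizer omitted)
```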
A prototype controller based on field-programmable gate array (FPGA) hardware was developed to perform real-time mode tracking using the high-speed camera diagnostic with the mode-tracking CNN model. In this system, a trained model was implemented directly in the firmware of an FPGA device onboard the frame-grabber hardware of the camera's data readout system. Adjusting the model size and its implementation-related parameters allowed an optimal trade-off between the model's prediction accuracy, its FPGA resource utilization, and its inference speed. Through fine-tuning these parameters, the final implementation satisfied all of the design constraints, achieving a total trigger-to-output latency of 17.6 μs and a throughput of up to 120 kfps. These results are on par with the existing GPU-based control system using magnetic sensor diagnostics, indicating that the camera-based controller will be capable of performing active feedback control of MHD modes on HBT-EP.
Novel Image Acquisition and Reconstruction Methods: Towards Autonomous MRI
Magnetic Resonance Imaging (MR Imaging, or MRI) offers superior soft-tissue contrast compared to other medical imaging modalities. However, across developing countries, access to MRI ranges from prohibitively expensive to scarcely available. The lack of educational facilities and the excessive costs involved in imparting technical training have resulted in a shortage of the skilled human resources required to operate MRI systems in developing countries.
While diagnostic medical imaging improves the utilization of facility-based rural health services and informs management decisions, MRI requires technical expertise to set up the patient and to acquire, visualize, and interpret data. Such local expertise is scarce in underserved geographies. Even in countries with scanner densities above the global average of 5.3 per million people, inefficient MRI workflows and usage create challenges for financial and timely access.
MRI is routinely employed for neuroimaging and, in particular, for dementia screening. Dementia affected 50 million people worldwide in 2018, with an estimated economic impact of US $1 trillion a year, and Alzheimer's Disease (AD) accounts for up to 60-80% of dementia cases. However, AD imaging using MRI is time-consuming, and protocol optimization to accelerate MR Imaging requires local expertise, since each pulse sequence involves multiple configurable parameters that need optimization for acquisition time, image contrast, and image quality. The lack of this expertise contributes to the highly inefficient utilization of MRI services, diminishing their clinical value.
Augmenting human capabilities can tackle these challenges and standardize practice. Autonomous, time-efficient acquisition, reconstruction, and visualization schemes that maximize MRI hardware usage, together with solutions that reduce reliance on human operation of MRI systems, could alleviate some of the challenges associated with the requirement for, and scarcity of, skilled human resources.
We first present a preliminary demonstration of autonomous MRI (AMRI), which simplifies the end-to-end MRI workflow of registering the subject, setting up and invoking an imaging session, acquiring and reconstructing the data, and visualizing the images. Our initial implementation of AMRI separates the required intelligence and user interaction from the acquisition hardware. AMRI performs intelligent protocolling and intelligent slice planning. Intelligent protocolling optimizes contrast while satisfying signal-to-noise ratio and acquisition time constraints. We acquired data from four healthy volunteers across three experiments that differed in acquisition time constraints. AMRI achieved comparable image quality across all experiments despite optimizing for acquisition duration, thereby indirectly optimizing for MR Value, a metric to quantify the value of MRI. We believe we have demonstrated the first autonomous MRI of the brain. We also present preliminary results from a deep learning (DL) tool for generating first-read, text-based radiological reports directly from input brain images. It can potentially alleviate the burden on radiologists, who experience the seventh-highest level of burnout among all physicians according to a 2015 survey.
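To make the intelligent-protocolling idea concrete, here is a minimal sketch of contrast optimization under SNR and scan-time constraints, using a textbook spin-echo signal model and a grid search over TR and TE. The tissue parameters, constraint thresholds, and SNR proxy are illustrative assumptions, not the AMRI implementation.

```python
# Illustrative sketch of constraint-based protocol optimization: grid-search TR/TE
# to maximize gray/white-matter contrast from a spin-echo signal model, subject to
# a minimum-signal (SNR proxy) constraint and a maximum scan-time constraint.
# All numbers are assumptions for illustration, not the AMRI parameters.
import numpy as np

T1 = {"gm": 1.2, "wm": 0.8}      # s, approximate tissue T1 values
T2 = {"gm": 0.10, "wm": 0.08}    # s, approximate tissue T2 values

def spin_echo_signal(tr, te, t1, t2):
    return (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

def scan_time(tr, n_phase_encodes=256):
    return tr * n_phase_encodes   # s, single-average 2D spin echo

best = None
for tr in np.arange(0.3, 3.0, 0.05):            # s
    for te in np.arange(0.01, 0.12, 0.005):     # s
        s_gm = spin_echo_signal(tr, te, T1["gm"], T2["gm"])
        s_wm = spin_echo_signal(tr, te, T1["wm"], T2["wm"])
        contrast = abs(s_gm - s_wm)
        if min(s_gm, s_wm) < 0.2:               # SNR-proxy constraint
            continue
        if scan_time(tr) > 180.0:               # acquisition-time constraint (s)
            continue
        if best is None or contrast > best[0]:
            best = (contrast, tr, te)

print("best contrast %.3f at TR=%.2f s, TE=%.0f ms" % (best[0], best[1], best[2] * 1e3))
```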
Next, we accelerate the routine brain imaging protocol employed at the Columbia University Irving Medical Center and leverage DL methods to boost image quality via image denoising. Since MR physics dictates that the volume of the object being imaged influences the amount of signal received, we also demonstrate subject-specific image denoising. The accelerated protocol resulted in a 1.94-fold gain in imaging throughput, translating to a 72.51% increase in MR Value. We also demonstrate that this accelerated protocol can potentially be employed for AD imaging.
Finally, we present ArtifactID, a DL tool to identify Gibbs ringing in low-field (0.36 T) and high-field (1.5 T and 3.0 T) brain MRI. We train separate binary classification models for low-field and high-field data and generate visual explanations via the Grad-CAM explainable-AI method to help develop trust in the models' predictions. Because low-field MRI is prone to artifacts, we also demonstrate motion detection using an accelerometer in a low-field MRI scanner.
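As an illustration of the Grad-CAM step, the following sketch computes a class-activation heatmap for a binary artifact classifier by weighting the last convolutional feature maps with gradient averages. The backbone, layer choice, and input are hypothetical stand-ins, not the ArtifactID models.

```python
# Minimal Grad-CAM sketch for a binary Gibbs-ringing classifier. The ResNet-18
# backbone, chosen layer, and random input are stand-ins for illustration only.
import torch
import torch.nn.functional as F
import torchvision

model = torchvision.models.resnet18(num_classes=2)   # hypothetical stand-in classifier
model.eval()
feats = {}

def hook(_module, _inputs, output):
    feats["act"] = output                             # keep the graph for autograd.grad

model.layer4.register_forward_hook(hook)              # last convolutional stage

image = torch.randn(1, 3, 224, 224)                   # placeholder brain slice
logits = model(image)
score = logits[0, 1]                                  # "artifact present" class score

grads = torch.autograd.grad(score, feats["act"])[0]   # d(score) / d(feature maps)
weights = grads.mean(dim=(2, 3), keepdim=True)        # global-average-pool the gradients
cam = F.relu((weights * feats["act"]).sum(dim=1, keepdim=True))
cam = F.interpolate(cam, size=image.shape[-2:], mode="bilinear", align_corners=False)
cam = (cam - cam.min()) / (cam.max() - cam.min() + 1e-8)   # heatmap normalized to [0, 1]
print(cam.shape)                                      # torch.Size([1, 1, 224, 224])
```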
In conclusion, our novel contributions in this work include: i) a software framework to demonstrate an initial implementation of autonomous brain imaging; ii) an end-to-end framework that leverages intelligent protocolling and DL-based image-denoising that can potentially be employed for accelerated AD imaging; and iii) a DL-based tool for automated identification of Gibbs ringing artifacts that may interfere with diagnosis at the time of radiological reading.
We envision AMRI augmenting human expertise to alleviate the challenges associated with the scarcity of skilled human resources and contributing to globally accessible MRI.
Using heterogeneous, longitudinal EHR data for risk assessment and early detection of cardiovascular disease
Cardiovascular disease (CVD) affects millions of people and is a leading cause of death worldwide. CVD comprises a broad set of conditions, including structural heart disease, coronary artery disease, and stroke. Risk for each of these conditions accumulates over long periods of time, depending on several risk factors. To reduce morbidity and mortality due to CVD, preventative treatments administered prior to a first CVD event are critical. According to clinical guidelines, such treatments should be guided by an individual's total risk within a window of time. A related objective is secondary prevention, or early detection, wherein the aim is to identify and mitigate the impact of a disease that has already taken effect. With the widespread adoption of electronic health records (EHRs), there is tremendous opportunity to build better methods for risk assessment and early detection.
However, existing methods which use EHRs are limited in several ways: (1) they do not leverage the full longitudinal history of patients, (2) they use a limited feature set or specific data modalities, and (3) they are rarely validated in broader populations and across different institutions. In this dissertation, I address each of these limitations. In Aim 1, I explore the challenge of handling longitudinal, irregularly sampled clinical data, proposing discriminative and generative approaches to model this data. In Aim 2, I develop a multimodal approach for the early detection of structural heart disease.
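One common way to handle the irregular sampling described in Aim 1 is to append the elapsed time since the previous observation to each visit's features and feed the sequence to a recurrent model. The sketch below illustrates that generic pattern; it is not necessarily the approach proposed in the dissertation.

```python
# Illustrative sketch: a discriminative model for irregularly sampled longitudinal
# EHR data that appends the time since the previous observation (delta-t) to each
# visit's feature vector and runs a GRU. A generic pattern, not the thesis method.
import torch
import torch.nn as nn

class IrregularGRUClassifier(nn.Module):
    def __init__(self, n_features: int, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(n_features + 1, hidden, batch_first=True)  # +1 for delta-t
        self.head = nn.Linear(hidden, 1)                             # e.g. CVD event risk

    def forward(self, values: torch.Tensor, times: torch.Tensor) -> torch.Tensor:
        # values: (batch, seq, n_features); times: (batch, seq) visit timestamps
        deltas = torch.diff(times, dim=1, prepend=times[:, :1])      # time since last visit
        x = torch.cat([values, deltas.unsqueeze(-1)], dim=-1)
        _, h = self.gru(x)
        return torch.sigmoid(self.head(h[-1]))                       # risk score in [0, 1]

model = IrregularGRUClassifier(n_features=5)
vals = torch.randn(2, 7, 5)                    # two patients, seven visits, five labs
ts = torch.cumsum(torch.rand(2, 7), dim=1)     # irregular visit times
print(model(vals, ts).shape)                   # torch.Size([2, 1])
```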
Finally, in Aim 3, I study how different feature inclusion choices affect the transportability of deep risk assessment models of coronary artery disease across institutions. Collectively, this dissertation contributes important insights towards building better approaches for risk assessment and early detection of CVD using EHR data and systematically assessing their transportability across institutions and populations.
Unraveling Canvas: from Bellini to Tintoretto
Over the course of the fifteenth and sixteenth centuries, canvas replaced panel and wall as the preferred support for painting in Venice, moving from the periphery to the core of artmaking. As it did so, canvas became key to the artistic processes and novel pictorial language developed by painters like Titian, Tintoretto, and Veronese. Sixteenth-century critics associated canvas with painting in Venice, a connection that has persisted to become a veritable trope of Venetian art history. Despite this, we have hitherto lacked a convincing account of Venetian canvas supports and their impact. This dissertation attempts to provide one by examining the adoption, development, and significance of canvas in Venetian art over the period 1400 to 1600.
Approaching canvas from multiple perspectives, this project offers a deeper understanding of what early modern canvas was at a material level, how it was made and supplied to painters, and its catalyzing role in early modern Venetian art. By tracing precisely how canvas operates within paintings, focusing on lodestar examples whilst drawing on extensive and intensive object-based research carried out on a large corpus, this thesis demonstrates how actively canvas participated in the elaboration of the pictorial poetics of mature Cinquecento art in Venice. It argues that we owe the existence of this distinctive artistic idiom in no small part to the twist of a yarn, the roughness of a thread, the thickness of a stitch. Canvas was critical to both the making and the meaning of these pictures.
The wider aims of the project are twofold: on the one hand, to model a methodology that integrates approaches such as visual, textual, and sociocultural analysis with technical art history and a conservation-informed comprehension of the materially altered nature of art objects; on the other, to contribute to an account of the history of an art form, the canvas picture, that still occupies a central role in the global art world today.
Legitimation Trials. The Limits of Liberal Government and the Federal Reserve's Quest for Embedded Autonomy
Economic sociologists have long produced rich accounts of the economy's embeddedness in social relations and the hybridity of contemporary governance architectures. However, all too often, they contented themselves with merely disenchanting a liberal ontology that divides the social world into neatly differentiated spheres, such as the state and the economy or the public and the private. In this dissertation, I argue that this is not enough. If we want to understand actually existing economic government, we also need to attend to the consequences of its persistent violation of the precepts of liberal order.
This dissertation does so by accounting for the simultaneity of the Federal Reserve's rise to the commanding heights of the US economy and the repeated, multi-pronged controversies over it. I contend that, together, the Fed's ascendance and the controversies surrounding it are symptomatic of the contradictions inherent to a liberal mode of governing "the economy" which, on the one hand, professes its investment in a clear boundary between the state and the economy but which, on the other hand, operationally rests on their entanglement. Its embeddedness in financial markets exposes the Fed to attacks that it is either colluding with finance or unduly smuggling political considerations into an otherwise apolitical economy.
In response, to secure its legitimacy as a neutral arbiter of market struggles, the Fed needs to invest in autonomization strategies to demonstrate that it is acting neither in the interests of capital nor on behalf of partisan politicians but in the public interest. Its autonomization strategies in turn feed back onto the modes of embeddedness and governing techniques the Fed deploys, often resulting in new controversies. Combining insights from economic sociology and the sociology of expertise, the perspective developed in this dissertation thus foregrounds the persistent tension between embeddedness and autonomy and the sequences of reiterated problem-solving it gives rise to. Based on extensive archival research and interviews with actors, I reconstruct three such sequences in the Fed's more-than-a-century-long quest for embedded autonomy across three independent but related empirical essays.
The first focuses on the decade immediately following the Federal Reserve System's founding in 1913. It traces how the confluence of democratic turmoil in the wake of World War I, its hybrid organizational structure, and an alliance with institutionalist economists led Fed policymakers to repurpose open market operations from a banking technique into a policy tool that reconciled different interests. This made it possible to take on a task no other central bank had attempted before: mitigating depressions. This major innovation briefly turned the Fed into "the chief stabilizer" before it failed to fulfill this role during the Great Depression. The essay thus adds a critical, oft-forgotten episode to the genealogy of the Fed's ascendancy and the rise of central banks to the foremost macroeconomic managers of our time.
The second essay most explicitly develops the theoretical argument underlying this dissertation and applies it to a practice that has been all but ignored in the scholarship on central banking and financial government: bank supervision. Emphasizing its distinctiveness from regulation, I reconstruct how the Fed folded supervision into its project of governing finance as a vital yet vulnerable system over the course of the second half of the 20th century and into the 21st. I especially focus on the Fed's autonomization strategies in the wake of the 2008 Great Financial Crisis and its internal struggles, which resulted in a more standardized, quantitative, and transparent supervisory process centered around the technique of stress testing. However, the Fed's efforts to reassert its autonomy and authority have in the meantime come under attack themselves. The essay traces these controversies, and subsequent reforms, to the present day, further demonstrating the recursive dynamic of the Fed's quest for embedded autonomy.
The third essay finally zooms in on a single event during the Great Financial Crisis: the first major public stress test, run by the Fed and the Treasury between February and May 2009. By reconstructing its socio-technical assembling in detail and comparing it to the failures of stress tests run by European agencies between 2009 and 2011, I show that the stress test's success rested on a reconfiguration of the state's embeddedness in financial circuits, allowing the Treasury's material and symbolic capital to back the exercise and the Fed to function as a conduit that iteratively gauged and shaped its audiences' expectations of what a credible test would look like. This made it possible to successfully frame the test as an autonomous exercise based on expertise. Probing the structural, socio-technical, and performative conditions of the Fed's claims to legitimacy, the essay thus resolves the "mystery" (Paul Krugman) of how a simulation technique could become a watershed event in the greatest financial crisis in a lifetime.
Evaluating the Promise of Biological Aging as a Leading Indicator of Population Health
Several substantive observations formed the basis for this research. First, the observation of stagnating life expectancy in the United States over the first two decades of the 21st century, representing a dubious form of American exceptionalism. Second, evidence suggesting that novel measures of biological aging might allow for early evaluation of population-level health trajectories, based on direct observation of health status in still-living people. Third, the opportunity to apply these measures to the study of population-level phenomena, using methods routinely employed in the fields of sociology, demography, and economics. This dissertation represents a proof-of-concept work to support the application of biological aging measures to population health surveillance.
In Chapter 2, I conduct a systematic literature review of novel measures and approaches to the quantification of population aging published since 2000 and identify three major classes of novel population aging measures. Biological-aging measures can be understood as a specific application of Sanderson and Scherbov's α-ages approach, which indexes "true age" to the distribution of some aging-related characteristic in a reference sample. Relative to other novel measures and approaches, however, biological-aging algorithms hold particular promise in their ability to provide direct measures of pre-clinical, aging-related health risk across the entire adult age range of a population.
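As a worked illustration of the α-age idea, the following sketch maps an individual's biomarker value onto the chronological age at which a reference sample shows the same mean value. The reference values are invented for illustration and are not drawn from the dissertation.

```python
# Illustrative sketch of the alpha-age idea: the alpha-age of an individual is the
# reference-sample age at which the aging-related characteristic takes the same
# (mean) value as the individual's. Reference numbers below are made up.
import numpy as np

ref_ages = np.arange(30, 81, 10)                                  # reference-sample ages
ref_characteristic = np.array([1.0, 1.4, 1.9, 2.6, 3.5, 4.7])     # mean biomarker by age

def alpha_age(person_value: float) -> float:
    # Interpolate the reference age at which the characteristic equals person_value
    # (assumes the characteristic increases monotonically with age).
    return float(np.interp(person_value, ref_characteristic, ref_ages))

print(alpha_age(2.2))   # ~54.3: the alpha-age of someone with characteristic value 2.2
```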
In Chapters 3 and 4, I apply published biological aging algorithms to blood-chemistry and organ-test data collected by the National Health and Nutrition Examination Surveys (NHANES) to test whether the U.S. population has grown biologically older over the past two decades, as some interpretations of life expectancy data would suggest, and to evaluate the extent to which selected social and environmental exposures might explain these trends. Formal age-period-cohort analysis revealed consistent period increases in biological aging from 1999 to 2018; while population aging slowed after the training cohort was measured in NHANES III (1988-1994), aging trajectories have reverted towards early-1990s levels since the turn of the century. Limited evidence of cohort effects was observed, with findings consistent across age, race, and sex, although racial disparities in biological aging persisted over the entire study period. Kitagawa-Blinder-Oaxaca decomposition analysis of four candidate exposures (i.e., BMI, smoking status, blood lead, and urinary polycyclic aromatic hydrocarbon levels) suggested that changes in the distribution of behavioral and environmental risk factors accounted for a substantial proportion of observed period trends and/or racial disparities in biological aging over the first two decades of the 21st century. Broadly, these results suggest that measures of biological aging can provide earlier and more precise readouts of population health trajectories and their drivers, ultimately informing next-generation public health efforts to promote healthy aging and aging health equity.
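For readers unfamiliar with the decomposition, the sketch below shows the basic two-group Kitagawa-Blinder-Oaxaca arithmetic on simulated data: the mean gap in an outcome is split into a part explained by differences in exposure distributions and a part attributable to differences in coefficients. The data and model specification are illustrative, not those used in the dissertation.

```python
# Minimal two-group Kitagawa-Blinder-Oaxaca decomposition on simulated data:
# mean gap = (explained by exposure distributions) + (unexplained coefficient part).
import numpy as np

rng = np.random.default_rng(0)

def simulate(n, beta, mean_x):
    X = np.column_stack([np.ones(n), rng.normal(mean_x, 1.0, size=n)])  # intercept + exposure
    y = X @ beta + rng.normal(0, 0.5, size=n)                           # aging outcome
    return X, y

beta_a, beta_b = np.array([1.0, 0.6]), np.array([0.8, 0.4])
Xa, ya = simulate(500, beta_a, mean_x=1.5)     # group A: higher exposure burden
Xb, yb = simulate(500, beta_b, mean_x=1.0)     # group B (reference)

ba, *_ = np.linalg.lstsq(Xa, ya, rcond=None)   # group-specific OLS coefficients
bb, *_ = np.linalg.lstsq(Xb, yb, rcond=None)

gap = ya.mean() - yb.mean()
explained = (Xa.mean(axis=0) - Xb.mean(axis=0)) @ bb    # endowment (distribution) component
unexplained = Xa.mean(axis=0) @ (ba - bb)               # coefficient component
print(f"gap={gap:.3f} explained={explained:.3f} unexplained={unexplained:.3f}")
```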
Scheduling and Routing under Uncertainty with Predictions
Uncertainty surrounds us daily, underscoring the need for effective decision-making strategies. In recent years, the large amount of available data has accelerated the development of novel methods for decision-making and optimization. This thesis pursues that line of inquiry, centering on a framework that employs predictions to enhance decision-making in various optimization problems.
We investigate scheduling and routing problems, which are fundamental in the field of sequential decision-making and optimization, within the framework of algorithms with predictions. Our goal is to improve performance by integrating predictions of unknown input parameters. The central question is: "Can we design algorithms that use predictions to enhance performance when the predictions are accurate, while still maintaining worst-case guarantees when they are inaccurate?"
Through theoretical and experimental analyses, we demonstrate that, by incorporating appropriate predictions of unknown input parameters, we can design algorithms that outperform existing results when predictions are accurate while maintaining worst-case guarantees even when the predictions are significantly erroneous.
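The following toy experiment illustrates the consistency-versus-robustness trade-off behind this question for single-machine scheduling: ordering jobs by predicted processing times matches the optimal schedule when predictions are accurate and degrades as prediction error grows. It is only an illustration of the framework, not an algorithm from the thesis.

```python
# Toy "algorithms with predictions" experiment: schedule jobs on one machine in order
# of predicted processing times and compare total completion time against the optimum
# (shortest-processing-time on the true times) as prediction noise grows.
import numpy as np

rng = np.random.default_rng(1)

def total_completion_time(proc_times, order):
    # Sum of completion times when jobs run back-to-back in the given order.
    return np.cumsum(proc_times[order]).sum()

true_times = rng.exponential(1.0, size=50)
opt = total_completion_time(true_times, np.argsort(true_times))   # SPT is optimal

for noise in [0.0, 0.5, 2.0]:
    predicted = true_times + rng.normal(0, noise, size=true_times.size)
    alg = total_completion_time(true_times, np.argsort(predicted))  # follow the prediction
    print(f"noise={noise:.1f}  ratio to optimum={alg / opt:.3f}")
```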
Spectral-switching analysis reveals real-time neuronal network representations of concurrent spontaneous naturalistic behaviors in the human brain
Over 30 years of functional imaging studies have demonstrated that the human brain operates as a complex and interconnected system, with distinct functional networks and long-range coordination of neural activity. Yet, how our brains coordinate our behavior from moment to moment, permitting us to think, talk, and move at the same time, has been almost impossible to decode (Chapter 1).
The invasive, long-term, and often multi-regional intracranial EEG (iEEG) monitoring utilized for epilepsy surgery evaluation presents a valuable opportunity for studying brain-wide dynamic neural activity in behaving human subjects. In this study, we analyzed over 93 hours of iEEG recordings, along with simultaneously acquired video recordings, from 10 patients with drug-resistant focal epilepsy syndromes who underwent invasive iEEG with broadly distributed bilateral depth electrodes for clinical evaluation.
Initially, we explored dynamic connectivity patterns quantified from band-limited neural activities using metrics from previous literature in a subset of subjects. These metrics can characterize long-range connectivity across brain regions and reveal variations over time, and they have shown success in identifying state differences using controlled task presentations and trial-averaged data. However, we found that replicating this success with naturalistic, complex behaviors in our subjects was challenging. Although these metrics demonstrate differences across wake and sleep states, they are less sensitive in differentiating more complicated and subtle state transitions during wakefulness. In addition, patterns identified from individual frequency bands exhibit patient-to-patient differences, making it difficult to generalize results across frequency bands and subjects (Chapter 2).
Inspired by clinical electrocortical stimulation mapping studies, which seek to identify critical brain sites for language and motor function, and by the frequency gradient observed in human scalp and intracranial EEG recordings, we developed a new approach to meet the requirements for real-time analysis and frequency band selection. It is worth mentioning that detecting state transitions in naturalistic behavior requires analyzing raw EEG during individual transitions; we refer to this as "real-time analysis" to distinguish it from formal task performance and trial-averaging techniques. Rather than representing data as time-varying signals within specific frequency bands, we incorporated all frequencies (2-55 Hz) into our analysis by calculating the power spectral density (PSD) at each electrode. This analysis confirmed that the human brain's neural activity PSD is heterogeneous, exhibiting a distinct topography with bilateral symmetry, consistent with prior resting-state MEG and iEEG studies. However, by investigating the variability of each region's PSD over time (within a 2-second moving window), we discovered the tendency of individual electrode channels to switch back and forth between two distinct power spectral densities (PSDs, 2-55 Hz) (Chapter 3).
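A minimal sketch of the per-channel analysis described here: estimate the 2-55 Hz PSD in consecutive 2-second windows, then cluster the windowed spectra into two states to expose the back-and-forth switching. The sampling rate, windowing, and clustering choices are assumptions for illustration, not the study's actual pipeline.

```python
# Illustrative per-channel "spectral switching" sketch: Welch PSDs (2-55 Hz) in
# consecutive 2-second windows, clustered into two spectral states per channel.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs = 512                                       # assumed iEEG sampling rate (Hz)
signal = np.random.randn(fs * 600)             # placeholder: 10 minutes of one channel

win = 2 * fs                                   # 2-second analysis windows
psds = []
for start in range(0, signal.size - win + 1, win):
    f, pxx = welch(signal[start:start + win], fs=fs, nperseg=fs)
    band = (f >= 2) & (f <= 55)
    psds.append(np.log10(pxx[band]))           # log-PSD restricted to 2-55 Hz
psds = np.array(psds)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(psds)
print("fraction of windows in state 0:", (labels == 0).mean())
```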
We further recognized that this "spectral switching" occurs synchronously between distant sites, even between regions with differing baseline PSDs, revealing long-range functional networks that could be obscured in the analysis of individual frequency bands. Moreover, the real-time PSD-switching dynamics of specific networks exhibited striking alignment with activities such as conversation, hand movements, and eyes open versus closed, revealing a multi-threaded functional network representation of concurrent naturalistic behaviors. These network-behavior relationships were stable over multiple days but were altered during sleep, suggesting state-dependent plasticity of brain-wide network organization (Chapter 4).
Our results provide robust evidence for the presence of multiple synchronous neuronal networks across the human brain. The real-time PSD-switching dynamics of these networks provide physiologically interpretable read-outs, demonstrating the parallel engagement of multiple brain regions in a range of concurrent naturalistic behaviors (Chapter 5).
Embodied Dialogic Spaces as Research Methodology for Students' Postgraduate Reflection on Their Dance Learning
This arts-based narrative research was an inquiry into how embodied dialogic spaces can provide access to reflections on dance learning by students after graduating from pre-professional dance programs (DanceWorks, Berlin, and Palucca University, Dresden, in Germany, and the Duncan Center, Prague, in the Czech Republic).
Dialogic spaces, a term used by only a few contemporary scholars, were examined as vital spaces of openness and inclusiveness in learning. Data were collected through different modalities (textual, visual, and embodied) and included surveys with alternative assessment tools, interviews, and somatic dance narratives (SDNs): personal dance solos for "inner" listening and embodied exchanges between the researcher and participants.
The SDNs served as a vital form of data reporting, reflecting an embodied version of all the data collected within a dialogic space between the researcher and the individual participants and analyzed with an Interpretive Phenomenological Analysis (IPA) approach. The phenomenon of former students' reflection on their dance learning and the research design within a dialogic space equally informed the researcher's perspective and interpretive reporting of this study.
This research argued for the need for dialogic spaces: non-judgmental spaces prioritizing ontological over instrumental learning, not only during an education but also for identifying lifelong learning skills after an education has been completed. It aimed to explore the transformative possibilities of dialogic spaces and their impact on individual growth.