Columbia University

Columbia University Academic Commons
    17,761 research outputs found

    Contested Sites of Feminine Agency: Ivory Grooming Implements in Late Medieval Europe

    No full text
    This dissertation contends with the diverse corpus of Gothic ivory grooming implements carved in France in the thirteenth and fourteenth centuries. Employing feminist, queer, posthumanist, and ecocritical methodologies, it explores these objects as tools in gender and identity formation. Attending to the complexity of medieval attitudes toward grooming and women and to the polysemy of these objects’ iconographies, this dissertation argues for the inherent ambiguity of the bodies that constitute and were constituted by these tools. It participates in a broader project of revealing the inherent ambiguity of medieval gender and its deep enmeshment with the nonhuman animal world by presenting ivory beauty implements as nexuses of excess and resistance to feminine ideals. Calling attention to the body of the elephant as the source of the grooming tools’ materiality, its analysis demonstrates how the subjugation of the nonhuman animal reverberates through objects created to give order to human animal bodies, in particular the bestial female body. The material, iconographical, functional, and textual strands wound together in ivory grooming tools reveal the women of flesh and ivory to be far more multilayered and subversive, resourceful and complex, than scholarship has hitherto recognized. At once tools of subjugation and instruments for asserting agency, ivory grooming tools become, in the hands of their users, sites of identity expression and self-transformation.

    Unraveling Canvas: from Bellini to Tintoretto

    No full text
    Over the course of the fifteenth and sixteenth centuries, canvas supplanted panel and wall as the preferred support for painting in Venice, moving from the periphery to the core of artmaking. As it did so, canvas became key to the artistic processes and novel pictorial language developed by painters like Titian, Tintoretto, and Veronese. Sixteenth-century critics associated canvas with painting in Venice, a connection that has persisted to become a veritable trope of Venetian art history. Despite this, we have hitherto lacked a convincing account of Venetian canvas supports and their impact. This dissertation, by examining the adoption, development, and significance of canvas in Venetian art over the period 1400 to 1600, attempts to provide one. Approaching canvas from multiple perspectives, this project offers a deeper understanding of what early modern canvas was at a material level, how it was made and supplied to painters, and its catalyzing role in early modern Venetian art. By tracing precisely how canvas operates within paintings, focusing on lodestar examples whilst drawing on extensive and intensive object-based research carried out on a large corpus, this thesis demonstrates how actively canvas participated in the elaboration of the pictorial poetics of mature Cinquecento art in Venice. It argues that we owe the existence of this distinctive artistic idiom in no small part to the twist of a yarn, the roughness of a thread, the thickness of a stitch. Canvas was critical to both the making and the meaning of these pictures. The wider aims of the project are twofold: on the one hand, to model a methodology that integrates approaches such as visual, textual, and sociocultural analysis with technical art history and a conservation-informed comprehension of the materially altered nature of art objects; on the other, to contribute to an account of the history of an art form—the canvas picture—that still occupies a central role in the global art world today.

    UVSSA regulates transcription-coupled genome maintenance

    Get PDF
    DNA damage is a constant threat to our genomes that drives genome instability and contributes to cancer progression. DNA damage interferes with important DNA transactions such as transcription and replication. DNA lesions are removed by repair pathways that ensure genome stability during transcription and replication. Here, we identify and characterize distinct roles for the ultraviolet-stimulated scaffold protein A (UVSSA) in the maintenance of genome stability during transcription in human cells. First, we unravel a novel function for UVSSA in transcription-coupled repair of DNA interstrand crosslinks (ICLs), genotoxic adducts that covalently link opposing strands of the DNA and block transcription and replication. UVSSA knockout cells are sensitive to ICL-inducing drugs, and UVSSA is specifically required for transcription-coupled repair of ICLs in a fluorescence-based reporter assay. Based on analysis of the UVSSA protein interactome in crosslinker-treated cells, we propose a model for transcription-coupled ICL repair (TC-ICR) that is initiated by stalling of transcribing RNA polymerase II (Pol II) at an ICL. Stalled Pol II is first bound by CSA and CSB, followed by UVSSA, which recruits TFIIH to initiate downstream lesion removal steps. Second, we establish that UVSSA counteracts MYC-dependent transcription stress to promote genome stability in cells aberrantly expressing the cMYC oncogene. UVSSA knockdown sensitizes cells to MYC expression, resulting in synthetic sickness and increased doubling time. UVSSA knockdown also impacts Pol II dynamics in MYC-activated cells. We conclude that UVSSA is required for regulation of Pol II during MYC-induced transcription to prevent transcription stress. Together, these studies expand our understanding of UVSSA’s role in genome stability during transcription and elucidate the poorly understood transcription-coupled ICL repair pathway.

    Using heterogeneous, longitudinal EHR data for risk assessment and early detection of cardiovascular disease

    No full text
    Cardiovascular disease (CVD) affects millions of people and is a leading cause of death worldwide. CVD consists of a broad set of conditions including structural heart disease, coronary artery disease, and stroke. Risk for each of these conditions accumulates over long periods of time depending on several risk factors. In order to reduce morbidity and mortality due to CVD, preventative treatments administered prior to a first CVD event are critical. According to clinical guidelines, such treatments should be guided by an individual’s total risk within a window of time. A related objective is secondary prevention, or early detection, wherein the aim is to identify and mitigate the impact of a disease that has already taken effect. With the widespread adoption of electronic health records (EHRs), there is tremendous opportunity to build better methods for risk assessment and early detection. However, existing methods that use EHRs are limited in several ways: (1) they do not leverage the full longitudinal history of patients, (2) they use a limited feature set or specific data modalities, and (3) they are rarely validated in broader populations and across different institutions. In this dissertation, I address each of these limitations. In Aim 1, I explore the challenge of handling longitudinal, irregularly sampled clinical data, proposing discriminative and generative approaches to model this data. In Aim 2, I develop a multimodal approach for the early detection of structural heart disease. Finally, in Aim 3, I study how different feature inclusion choices affect the transportability of deep risk assessment models of coronary artery disease across institutions. Collectively, this dissertation contributes important insights towards building better approaches for risk assessment and early detection of CVD using EHR data and systematically assessing their transportability across institutions and populations.

    Legitimation Trials. The Limits of Liberal Government and the Federal Reserve's Quest for Embedded Autonomy

    No full text
    Economic sociologists have long produced rich accounts of the economy’s embeddedness in social relations and the hybridity of contemporary governance architectures. However, all too often, they contented themselves with merely disenchanting a liberal ontology that divides the social world into neatly differentiated spheres, such as the state and the economy or the public and the private. In this dissertation, I argue that this is not enough. If we want to understand actually existing economic government, we also need to attend to the consequences of its persistent violation of the precepts of liberal order. This dissertation does so by accounting for the simultaneity of the Federal Reserve’s rise to the commanding heights of the US economy and the repeated, multi-pronged controversies over it. I contend that together, the Fed’s ascendance and the controversies surrounding it are symptomatic of the contradictions inherent to a liberal mode of governing ‘the economy’ which, on the one hand, professes its investment in a clear boundary between the state and the economy but which, on the other hand, operationally rests on their entanglement. Its embeddedness in financial markets exposes the Fed to attacks that it is either colluding with finance or that it unduly smuggles political considerations into an otherwise apolitical economy. In response, to secure its legitimacy as a neutral arbiter of market struggles, the Fed needs to invest in autonomization strategies to demonstrate that it is acting neither in the interests of capital nor on behalf of partisan politicians but in the public interest. Its autonomization strategies in turn feed back onto the modes of embeddedness and governing techniques the Fed deploys, often resulting in new controversies.
Combining insights from economic sociology and the sociology of expertise, the perspective developed in this dissertation thus foregrounds the persistent tension between embeddedness and autonomy and the sequences of reiterated problem-solving it gives rise to. Based on extensive archival research and interviews with actors, I reconstruct three such sequences in the Fed’s more-than-a-century-long quest for embedded autonomy in three independent but related empirical essays. The first focuses on the decade immediately following the Federal Reserve System’s founding in 1913. It traces how the confluence of democratic turmoil in the wake of World War I, its hybrid organizational structure, and an alliance with institutionalist economists led Fed policymakers to repurpose open market operations from a banking technique into a policy tool that reconciled different interests. This made it possible to take on a task no other central bank had attempted before: mitigating depressions. This major innovation briefly turned the Fed into “the chief stabilizer” before it failed to fulfill this role during the Great Depression. The essay thus adds a critical, oft-forgotten episode to the genealogy of the Fed’s ascendancy and the rise of central banks to the foremost macroeconomic managers of our time. The second essay most explicitly develops the theoretical argument underlying this dissertation and applies it to a practice that has been all but ignored in the scholarship on central banking and financial government: bank supervision. Emphasizing its distinctiveness from regulation, I reconstruct how the Fed folded supervision into its project of governing finance as a vital, yet vulnerable system over the course of the second half of the 20th century and into the 21st.
I especially focus on the Fed’s autonomization strategies in the wake of the 2008 Great Financial Crisis and its internal struggles, which resulted in a more standardized, quantitative, and transparent supervisory process centered around the technique of stress testing. However, the Fed’s efforts to reassert its autonomy and authority have in the meantime come under attack themselves. The essay traces these controversies, and subsequent reforms, to the present day, further demonstrating the recursive dynamic of the Fed’s quest for embedded autonomy. The third essay finally zooms in on a single event during the Great Financial Crisis: the first major public stress test run by the Fed and the Treasury between February and May 2009. By reconstructing its socio-technical assembling in detail and comparing it to the failures of stress tests run by European agencies between 2009 and 2011, I show that the stress test’s success rested on a reconfiguration of the state’s embeddedness in financial circuits, allowing the Treasury’s material and symbolic capital to back the exercise and the Fed to function as a conduit that iteratively gauged and shaped its audiences’ expectations as to what a credible test would look like. This made it possible to successfully frame the test as an autonomous exercise based on expertise. Probing the structural, socio-technical, and performative conditions of the Fed’s claims to legitimacy, the essay thus resolves the ‘mystery’ (Paul Krugman) of how a simulation technique could become a watershed event in the greatest financial crisis in a lifetime.

    The New Canadian Model Investment Agreement: Quiet Changes

    Get PDF
    This Perspective explores the implications for the home countries of large MNEs of the agreement reached by over 140 countries in 2021 to enact a corporate minimum tax of 15%. It argues that the corporate minimum tax complements the trend to reduce the negative impact of unfettered globalization on labor, and it protects the ability of home countries to finance a robust social safety net. Home countries should adopt the corporate minimum tax, and that includes the US, which last year failed to adapt its Global Intangible Low-Taxed Income approach to the corporate minimum tax.

    Governments and Companies Should Heed Climate and Governance Risks in Oil Asset Transactions

    Get PDF

    How to Get the Best Deal from Large-Scale Foreign Direct Investment Incentives

    Get PDF

    Thermal Control and Optimization for Assembled Photonic Interconnect Systems

    No full text
    In recent years, there has been significant progress in the development of photonic integrated circuits (PICs). Matured fabrication and simulation techniques have enabled the development of novel devices and system architectures. Ideally, these newly developed technologies are put to the test in the lab, both to verify that they perform as simulated and to demonstrate the viability of the technology. Testing the increasingly complex optical circuits brings various challenges. One of these challenges is the sensitivity to temperature changes of many optical circuits, especially micro-ring and micro-disk resonators (MRRs and MDRs). Due to the nature of these resonators, slight deviations in the material properties have a large impact on their resonant frequency. Despite this, their small footprint and wavelength selectivity make them promising components for many future technologies, especially Dense Wavelength Division Multiplexed (DWDM) communication links. Multiple resonators cascaded on a single bus waveguide can operate on multiple wavelengths simultaneously with relatively few components and in a small combined area. Since every extra connection to a PIC has a footprint similar to that of a micro resonator, a packaging-optimized thermal control scheme is needed to fully leverage all advantages of micro resonators. This work focuses on the thermal stabilization of cascaded micro resonators and on how thermal control can be optimized to simplify the packaging of PIC prototypes. This simplification enables the demonstration of complex systems and more realistic scenarios for thermal control of both resonators and other circuits. It first shows how a number of PICs and their respective packages were built, keeping subsequent testing in mind. Then, it demonstrates automatic initialization of cascaded MRRs and how stable operation, while undergoing large temperature swings, can be achieved using a minimum number of connections to the PIC.
Next, it shows stable operation of an eight-wavelength receiver, operating uncooled at 16 Gb/s/λ, over a record 75 °C temperature range. Finally, it presents how all the lessons learned are brought together to build a 2.5D-integrated SiPh transceiver capable of transmitting 512 Gb/s bidirectionally. This transceiver can be plugged into Field Programmable Gate Arrays (FPGAs), which can then be used to implement accelerators for real computing problems, used as a PCIe bridge to a standard compute server, or both. The transceiver is also designed to work with many types of optical switches, allowing demonstrations of novel switching algorithms and network architectures. The contributions discussed in this thesis can assist in enabling future high-bandwidth optical interfaces by optimizing the thermal control strategy and may be used at all stages of PIC design and packaging to facilitate the development of new technologies.

    When All You Have is a Hammer: Beyond Schenker’s Urlinie

    Get PDF
    This dissertation raises problems with the methodology of Schenkerian analysis and attempts to find solutions. Balancing ideological critique, intellectual history, and musical analysis with criticisms of the theory itself, my aim is to argue that Schenker’s conception of his own theory—and in particular of his background Urlinie—has greatly restricted the analytical possibilities of his methodology, including how it is practiced today. Chapter 1 will introduce and summarize this argument as a whole, including some of my musical and philosophical priors. Chapter 2 will explore how Schenker’s assumptions about his theory fulfill what I will (following Nicholas Cook) characterize as a “retrospective prophecy,” which I argue is continuous with the motivated and circular reasoning that propped up Schenker’s problematic wider ideology. Chapter 3 introduces two ideas that combat this tendency and inform my own analyses: the “modified” approaches to Schenker (in the work of scholars like David Neumeyer) as well as “oscillation” between different analytical interpretations (as Marianne Kielian-Gilbert describes work like Joseph Dubiel’s). Chapter 4 attempts to bring these two together, and I argue that this opens the way to a greater degree of analytical possibility than has hitherto been realized. In short, I argue that contemporary Schenkerian analysis has remained too tied to Schenker’s original formulation, to the detriment of the theory, but that not many Schenkerians have appreciated this. Along with attempting to find solutions to these problems, then, I want to convince Schenkerians that there are problems here to be solved in the first place, by showing how the very things analysts find valuable about the theory can be improved through the incorporation of “modified” and/or “oscillatory” approaches.

    33,104 full texts

    39,610 metadata records
    Updated in the last 30 days.
    Columbia University Academic Commons is based in the United States.