
    Evolution of South Atlantic density and chemical stratification across the last deglaciation

    This is the author accepted manuscript. The final version is available from the National Academy of Sciences via the DOI in this record.
    Explanations of the glacial-interglacial variations in atmospheric pCO2 invoke a significant role for the deep ocean in the storage of CO2. Deep-ocean density stratification has been proposed as a mechanism to promote the storage of CO2 in the deep ocean during glacial times. A wealth of proxy data supports the presence of a "chemical divide" between intermediate and deep water in the glacial Atlantic Ocean, which indirectly points to an increase in deep-ocean density stratification. However, direct observational evidence of changes in the primary controls of ocean density stratification, i.e., temperature and salinity, remains scarce. Here, we use Mg/Ca-derived seawater temperature and salinity estimates determined from temperature-corrected δ18O measurements on the benthic foraminifer Uvigerina spp. from deep and intermediate water-depth marine sediment cores to reconstruct the changes in density of sub-Antarctic South Atlantic water masses over the last deglaciation (i.e., 22-2 ka before present). We find that a major breakdown in the physical density stratification significantly lags the breakdown of the deep-intermediate chemical divide, as indicated by the chemical tracers benthic foraminifer δ13C and foraminifer/coral 14C. Our results indicate that chemical destratification likely resulted in the first rise in atmospheric pCO2, whereas the density destratification of the deep South Atlantic lags the second rise in atmospheric pCO2 during the late deglacial period. Our findings emphasize that the physical and chemical destratification of the ocean are not as tightly coupled as generally assumed.
    J.R. was funded jointly by the British Geological Survey/British Antarctic Survey (Natural Environment Research Council) and the University of Cambridge. J.G. was funded by the Gates Cambridge Trust. L.C.S. acknowledges support from the Royal Society and NERC Grant NE/J010545/1. C.W. acknowledges support from the European Research Council Grant ACCLIMATE 339108. This work was funded (in part) by the European Research Council (ERC Grant 2010-NEWLOG ADG-267931 HE). N.V.R. acknowledges support from EU RTN NICE (36127).
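The reconstruction described above turns paired temperature and salinity estimates into water-mass density. A minimal sketch of that step, using a linearized equation of state with illustrative coefficients (the function, coefficients and example T/S values are assumptions for illustration; published reconstructions would use a full equation of state such as TEOS-10):

```python
# Hypothetical sketch, not the paper's method: seawater density from
# reconstructed temperature (degC) and salinity via a linearized equation
# of state. Coefficient values are illustrative order-of-magnitude choices.
def density_linear(T, S, rho0=1027.0, T0=10.0, S0=35.0,
                   alpha=1.7e-4, beta=7.6e-4):
    """Density (kg/m^3); alpha: thermal expansion, beta: haline contraction."""
    return rho0 * (1.0 - alpha * (T - T0) + beta * (S - S0))

# Illustrative values only: cold, salty glacial deep water versus warmer,
# fresher deglacial water. Denser deep water means stronger stratification.
rho_glacial = density_linear(T=0.0, S=35.5)
rho_deglacial = density_linear(T=2.0, S=34.8)
print(rho_glacial > rho_deglacial)  # True: glacial deep water is denser
```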

    Emotions and Digital Well-being. The rationalistic bias of social media design in online deliberations

    In this chapter we argue that emotions are mediated incompletely in online social media because of the heavy reliance on textual messages, which fosters a rationalistic bias and an inclination towards less nuanced emotional expressions. This incompleteness can arise from obscuring emotions, showing less than their original intensity, misinterpreting emotions, or eliciting emotions without feedback and context. Online interactions and deliberations tend to contribute to, rather than overcome, stalemates and informational bubbles, partly due to the prevalence of anti-social emotions. It is tempting to see emotions as the cause of the problem of online verbal aggression and bullying. However, we argue that social media are actually designed in a predominantly rationalistic way, because of the reliance on text-based communication, thereby filtering out social emotions and leaving space for easily expressed anti-social emotions. Based on research on emotions that sees them as key ingredients of moral interaction and deliberation, as well as on research on text-based versus non-verbal communication, we propose a richer understanding of emotions, requiring different designs of online deliberation platforms. We propose that such designs should move away from text-centred approaches and find ways to incorporate the expression of the full range of human emotions, so that these can play a constructive role in online deliberations.

    Emergent Gauge Fields in Holographic Superconductors

    Holographic superconductors have so far been studied in the absence of dynamical electromagnetic fields, namely in the limit in which they coincide with holographic superfluids. It is possible, however, to introduce dynamical gauge fields if a Neumann-type boundary condition is imposed on the AdS boundary. In 3+1 dimensions, the dual theory is a 2+1 dimensional CFT whose spectrum contains a massless gauge field, signaling the emergence of a gauge symmetry. We study the impact of a dynamical gauge field in vortex configurations, where it is known to significantly affect the energetics and phase transitions. We calculate the critical magnetic fields H_c1 and H_c2, finding that holographic superconductors are of Type II (H_c1 < H_c2). We extend the study to 4+1 dimensions, where the gauge field does not appear as an emergent phenomenon but can be introduced, by a proper renormalization, as an external dynamical field. We also compare our predictions with those arising from a Ginzburg-Landau theory and identify the generic properties of Abrikosov vortices in holographic models.
    Comment: 19 pages, 14 figures, few comments added, version published in JHEP
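The Type I/Type II distinction invoked here reduces, in Ginzburg-Landau theory, to the size of the GL parameter κ = λ/ξ. A minimal sketch of that classification (illustrative numbers, not values computed in the paper):

```python
import math

# Hypothetical sketch of the Ginzburg-Landau classification, not the
# paper's holographic computation: a superconductor is Type II
# (H_c1 < H_c2, admitting Abrikosov vortices) when the GL parameter
# kappa = lambda/xi exceeds 1/sqrt(2), and Type I otherwise.
def gl_type(penetration_depth, coherence_length):
    kappa = penetration_depth / coherence_length
    return "Type II" if kappa > 1.0 / math.sqrt(2.0) else "Type I"

# Illustrative lengths only (metres), not taken from the paper.
print(gl_type(200e-9, 2e-9))   # kappa = 100 -> Type II
print(gl_type(50e-9, 100e-9))  # kappa = 0.5 -> Type I
```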

    Limited Projection 3D X-Ray Tomography Using the Maximum Entropy Method

    It is well known from physics that the reconstruction of physical quantities from experimental data is often obstructed by incomplete information, the presence of noise and the ill-posed nature of the inversion problem. It was shown [1] that a Bayesian reconstruction (BR) in terms of the Maximum Entropy Method (MEM), combined with unbiased a priori knowledge where available, is one way to overcome these difficulties. Similar problems occur when extracting useful information from incomplete data sets in technical applications. The solution of the resulting iteration procedure, if it converges, gives the most probable solution among all possible ones. In the case of radiographic techniques, difficulties occur if there is no free access around the object or if the number of available radiographic projections is limited for other reasons, such as a restricted maximum exposure (often required for medical applications) or economic constraints. This situation, characterized by a significant lack of data, makes it impossible to apply the reconstruction algorithms usually used for computed tomography (CT). Alternative reconstruction algorithms can be obtained by introducing prior information about the object and the structures of interest (compare [1–6]). Such algorithms meet practical requirements like robustness and reduced experimental and numerical effort. For NDE applications, e.g. the inspection of welds or castings, prior knowledge can be introduced from a practical point of view by assuming a binary or multi-material structure. This significantly reduces the number of permissible solutions and therefore the number of required radiographic projections.
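The entropy-regularized iteration described above can be sketched in a few lines. The sketch below is a generic, hypothetical illustration (toy projection geometry, flat default model, plain gradient descent), not the algorithm of [1]: it recovers a small non-negative "image" from fewer projections than pixels by trading data fit against relative entropy to a default model.

```python
import numpy as np

# Hypothetical MEM-style sketch, not the algorithm of [1]: recover f >= 0
# from underdetermined data d = A f by minimizing
#   chi2(f) - alpha * S(f),   S(f) = -sum_i f_i * (log(f_i / m_i) - 1),
# where m is a default model encoding the available prior knowledge.
rng = np.random.default_rng(0)
n_pix, n_proj = 8, 4                       # fewer projections than pixels
A = rng.random((n_proj, n_pix))            # stand-in projection geometry
f_true = np.array([0., 0., 1., 1., 1., 0., 0., 0.])  # binary object (prior idea)
d = A @ f_true                             # noiseless projection data

m = np.full(n_pix, 0.5)                    # flat default model
alpha, lr = 0.01, 0.01                     # regularization weight, step size
f = m.copy()
for _ in range(20000):
    grad_chi2 = 2.0 * A.T @ (A @ f - d)        # gradient of the data misfit
    grad_S = -np.log(np.maximum(f, 1e-12) / m)  # gradient of the entropy
    f = np.clip(f - lr * (grad_chi2 - alpha * grad_S), 1e-12, None)

print(np.round(f, 2))                      # most probable non-negative solution
```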

    Deglacial changes in flow and frontal structure through the Drake Passage

    © 2017 Elsevier B.V.
    The oceanic gateways of the Drake Passage and the Agulhas Current are critical locations for the inflow of intermediate-depth water masses to the Atlantic, which contribute to the shallow return flow that balances the export of deep water from the North Atlantic. The thermohaline properties of northward-flowing intermediate water are ultimately determined by the inflow of water through these oceanic gateways. Here, we focus on the less well-studied "Cold Water Route" through the Drake Passage. We present millennially-resolved bottom current flow speed and sea surface temperature records downstream of the Drake Passage spanning the last 25,000 yr. We find that prior to 15 ka, bottom current flow speeds at sites in the Drake Passage region were dissimilar and there was a marked anti-phasing between sea surface temperatures at sites upstream and downstream of the Drake Passage. After 14 ka, we observe a remarkable convergence of flow speeds coupled with a sea surface temperature phase change at sites upstream and downstream of the Drake Passage. We interpret this convergence as evidence for a significant southward shift of the sub-Antarctic Front from a position north of the Drake Passage. This southward shift increased the through-flow of water from the Pacific, likely reducing the density of Atlantic Intermediate Water. The timing of the southward shift in the sub-Antarctic Front is synchronous with a major re-invigoration of the Atlantic Meridional Overturning Circulation, with which, we argue, it may be linked.

    Clinical significance of VEGF-A, -C and -D expression in esophageal malignancies

    Vascular endothelial growth factors (VEGF)-A, -C and -D are members of the pro-angiogenic VEGF family of glycoproteins. VEGF-A is known to be the most important angiogenic factor under physiological and pathological conditions, while VEGF-C and VEGF-D are implicated in the development and sprouting of lymphatic vessels, so-called lymphangiogenesis. Local tumor progression, lymph node metastases and hematogenous tumor spread are important prognostic factors for esophageal carcinoma (EC), one of the most lethal malignancies worldwide. We found solid evidence in the literature that VEGF expression contributes to tumor angiogenesis, tumor progression and lymph node metastasis in esophageal squamous cell carcinoma (SCC), and many authors have shown a prognostic value for VEGF assessment. In adenocarcinoma (AC) of the esophagus, angiogenic properties are acquired in early stages, particularly in precancerous lesions like Barrett's dysplasia. However, VEGF expression fails to give prognostic information in AC of the esophagus. VEGF-C and VEGF-D were detected in SCC and dysplastic lesions, but not in normal mucosa of the esophagus. VEGF-C expression might be associated with lymphatic tumor invasion, lymph node metastases and advanced disease in esophageal SCC and AC. Therapeutic interference with VEGF signaling may prove to be a promising form of anti-angiogenic co-treatment in esophageal carcinoma. However, concrete clinical data are still pending.

    Epidemiology of Doublet/Multiplet Mutations in Lung Cancers: Evidence that a Subset Arises by Chronocoordinate Events

    BACKGROUND: Evidence strongly suggests that spontaneous doublet mutations in normal mouse tissues generally arise from chronocoordinate events. These chronocoordinate mutations sometimes reflect "mutation showers", which are multiple chronocoordinate mutations spanning many kilobases. However, little is known about the mutagenesis of doublet and multiplet mutations (domuplets) in human cancer. Lung cancer accounts for about 25% of all cancer deaths. Herein, we analyze the epidemiology of domuplets in the EGFR and TP53 genes in lung cancer. The EGFR gene is an oncogene in which doublets are generally driver-plus-driver mutations, while the TP53 gene is a tumor suppressor gene with the more typical situation in which doublets derive from a driver and a passenger mutation. METHODOLOGY/PRINCIPAL FINDINGS: EGFR mutations identified by sequencing were collected from 66 published papers and our updated EGFR mutation database (www.egfr.org). TP53 mutations were collected from IARC version 12 (www-p53.iarc.fr). For EGFR and TP53 doublets, no clearly significant differences in race, ethnicity, gender or smoking status were observed. Doublets in the EGFR and TP53 genes in human lung cancer are elevated about eight- and three-fold, respectively, relative to spontaneous doublets in mouse (6% and 2.3% versus 0.7%). CONCLUSIONS/SIGNIFICANCE: Although no one characteristic is definitive, the aggregate properties of doublet and multiplet mutations in lung cancer are consistent with a subset derived from chronocoordinate events in the EGFR gene: i) the eight frameshift doublets (present in 0.5% of all patients with EGFR mutations) are clustered and produce a net in-frame change; ii) about 32% of doublets are very closely spaced (≤30 nt); and iii) multiplets contain two or more closely spaced mutations. TP53 mutations in lung cancer are very closely spaced (≤30 nt) in 33% of doublets, and multiplets generally contain two or more very closely spaced mutations. Work in model systems is necessary to confirm the significance of chronocoordinate events in lung and other cancers.
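The fold elevations quoted above follow directly from the reported frequencies; a quick check of the arithmetic (figures taken from the abstract):

```python
# Checking the abstract's arithmetic: doublet frequencies of 6% (EGFR) and
# 2.3% (TP53) in human lung cancer versus 0.7% spontaneous doublets in mouse.
egfr_doublets, tp53_doublets, mouse_doublets = 0.06, 0.023, 0.007

print(round(egfr_doublets / mouse_doublets, 1))  # 8.6 -> "about eight-fold"
print(round(tp53_doublets / mouse_doublets, 1))  # 3.3 -> "about three-fold"
```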

    Feasibility of an automated interview grounded in multiple mini interview (MMI) methodology for selection into the health professions: an international multimethod evaluation.

    OBJECTIVES: Global, COVID-driven restrictions on face-to-face interviews for healthcare student selection have forced admission staff to rapidly adopt adapted online systems before supporting evidence is available. We have developed what we believe is the first automated interview grounded in multiple mini-interview (MMI) methodology. This study aimed to explore the test-retest reliability, acceptability and usability of the system. DESIGN, SETTING AND PARTICIPANTS: Multimethod feasibility study in Physician Associate programmes at two UK universities and one US university during 2019-2020. PRIMARY AND SECONDARY OUTCOMES: Feasibility measures (test-retest reliability, acceptability and usability) were assessed using intraclass correlation (ICC), descriptive statistics, and thematic and content analysis. METHODS: Volunteers took (T1), then repeated (T2), the automated MMI, with a 7-day interval (±2 days), then completed an evaluation questionnaire. Admission staff participated in focus group discussions. RESULTS: Sixty-two students and seven admission staff participated: 34 students and 4 staff from the UK universities and 28 students and 3 staff from the US university. Good-to-excellent test-retest reliability was observed at two sites (US and UK2), with T1-T2 ICCs between 0.65 and 0.81 (p<0.001) for individual total scores (range 80.6-119), 0.6-0.91 (p<0.005) for station total scores, and ≥0.79 (p<0.001) by individual site. The mean test-retest ICC across all three sites was 0.82 (p<0.001; 95% CI 0.7 to 0.9). Admission staff reported the potential to reduce resource costs and bias through a more objective screening tool, used for preselection or to replace some MMI stations in a 'hybrid model'. Maintaining human interaction through 'touch points' was considered essential. Users evaluated the system positively, stating it was intuitive with an accessible interface. Concepts chosen for dynamic probing needed to be appropriately tailored.
    CONCLUSION: These preliminary findings suggest that the system is reliable, generating consistent scores for candidates, and is acceptable to end users provided human touch points are maintained. Thus, there is evidence for the potential of such an automated system to augment healthcare student selection.
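The test-retest figures above are intraclass correlations. As a sketch of how such a coefficient is computed, the snippet below implements a two-way, absolute-agreement, single-measure ICC (ICC(2,1)) on made-up T1/T2 total scores; the scores and the choice of ICC form are assumptions for illustration, not the study's data or analysis.

```python
import numpy as np

# Hypothetical sketch: two-way, absolute-agreement, single-measure ICC(2,1)
# for test-retest reliability. Scores are invented, not the study's data.
def icc2_1(scores):
    """scores: (n_subjects, k_sessions) array of totals."""
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)              # per-subject means
    col_means = scores.mean(axis=0)              # per-session means
    ms_rows = k * ((row_means - grand) ** 2).sum() / (n - 1)  # between subjects
    ms_cols = n * ((col_means - grand) ** 2).sum() / (k - 1)  # between sessions
    resid = scores - row_means[:, None] - col_means[None, :] + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

t1 = np.array([95.0, 102.0, 88.0, 110.0, 99.0, 91.0])  # invented T1 totals
t2 = np.array([97.0, 100.0, 90.0, 112.0, 96.0, 93.0])  # invented T2 totals
icc = icc2_1(np.column_stack([t1, t2]))
print(round(icc, 2))  # high ICC: retest scores track first attempts closely
```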