Antarctic intermediate water circulation in the South Atlantic over the past 25,000 years
Antarctic Intermediate Water is an essential limb of the Atlantic meridional overturning circulation that redistributes heat and nutrients within the Atlantic Ocean. Existing reconstructions have yielded conflicting results on the history of Antarctic Intermediate Water penetration into the Atlantic across the most recent glacial termination. In this study we present leachate, foraminiferal, and detrital neodymium isotope data from three intermediate-depth cores collected from the southern Brazil margin in the South Atlantic covering the past 25 kyr. These results reveal that strong chemical leaching following decarbonation does not extract past seawater neodymium composition in this location. The new foraminiferal records reveal no changes in seawater Nd isotopes during abrupt Northern Hemisphere cold events at these sites. We therefore conclude that there is no evidence for greater incursion of Antarctic Intermediate Water into the South Atlantic during either the Younger Dryas or Heinrich Stadial 1. We do, however, observe more radiogenic Nd isotope values in the intermediate-depth South Atlantic during the mid-Holocene. This radiogenic excursion coincides with evidence for a southward shift in the Southern Hemisphere westerlies that may have resulted in a greater entrainment of radiogenic Pacific-sourced water during intermediate water production in the Atlantic sector of the Southern Ocean. Our intermediate-depth records show similar values to a deglacial foraminiferal Nd isotope record from the deep South Atlantic during the Younger Dryas but are clearly distinct during the Last Glacial Maximum and Heinrich Stadial 1, demonstrating that the South Atlantic remained chemically stratified during Heinrich Stadial 1. Funding: Natural Environment Research Council (Grant IDs: NE/K005235/1, NE/F006047/1); National Science Foundation (Grant ID: OCE-1335191); Rutherford Memorial Scholarship; DFG Research Center/Cluster of Excellence “The Ocean in the Earth System”; FAPESP (Grant ID: 2012/17517-3); CAPES (Grant IDs: 1976/2014, 564/2015).
BPMN task instance streaming for efficient micro-task crowdsourcing processes
The Business Process Model and Notation (BPMN) is a standard for modeling and executing business processes with human or machine tasks. The semantics of tasks is usually discrete: a task has exactly one start event and one end event; for multi-instance tasks, all instances must complete before an end event is emitted. We propose a new task type and a streaming connector for crowdsourcing, able to run hundreds or thousands of micro-task instances in parallel. The two constructs provide task streaming semantics that are new to BPMN, enable the modeling and efficient enactment of complex crowdsourcing scenarios, and are also applicable beyond the special case of crowdsourcing. We implement the necessary design and runtime support on top of CrowdFlower, demonstrate the viability of the approach via a case study, and report on a set of runtime performance experiments.
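To make the contrast between the two completion semantics concrete, the following sketch (purely illustrative; all names are hypothetical and this is not the authors' CrowdFlower-based implementation) compares the standard wait-for-all multi-instance behaviour with a per-instance streaming behaviour:

    # Illustrative only (hypothetical names, not the paper's implementation):
    # standard BPMN multi-instance semantics emit a single end event after all
    # task instances complete, whereas a streaming connector would forward each
    # instance's output downstream as soon as it becomes available.
    from concurrent.futures import ThreadPoolExecutor, as_completed

    def run_micro_task(item):
        # Placeholder for one crowdsourced micro-task (e.g. labelling one image).
        return f"answer for {item}"

    def multi_instance_batch(items, downstream):
        # Wait-for-all semantics: one end event carrying all results.
        with ThreadPoolExecutor() as pool:
            results = list(pool.map(run_micro_task, items))
        downstream(results)

    def multi_instance_streaming(items, downstream):
        # Streaming semantics: downstream work starts as each instance finishes.
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(run_micro_task, i) for i in items]
            for fut in as_completed(futures):
                downstream([fut.result()])

    if __name__ == "__main__":
        multi_instance_streaming(range(5), print)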
Digital prosumption labour on social media in the context of the capitalist regime of time
So-called social media such as Facebook, Twitter, YouTube, Weibo and LinkedIn are an expression of changing regimes of time in capitalist society. This paper discusses how corporate social media are related to the capitalist organization of time and the changes this organization is undergoing. It uses social theory to conceptualize changes in society and its time regime, and how these changes shape social media. These changes have been described with notions such as prosumption, consumption labour, play labour (playbour) and digital labour. The paper contextualizes digital labour on social media with the help of a model of society that distinguishes three subsystems (the economy, politics, culture) and three forms of power (economic, political, cultural). In modern society, these systems are based on the logic of the accumulation of power and the acceleration of accumulation. The paper discusses the role of various dimensions of time in capitalism with the help of a model that is grounded in Karl Marx’s works. It points out the importance of the category of time for a labour theory of value and a digital labour theory of value. Social media are expressions of the changing time regimes that modern society has been undergoing, especially in relation to the blurring of leisure and labour time (play labour), production and consumption time (prosumption), new forms of absolute and relative surplus value production, the acceleration of consumption with the help of targeted online advertising, and the creation of speculative, future-oriented forms of fictitious capital.
The need for multidisciplinarity in specialist training to optimize future patient care
Harmonious interactions between radiation, medical, interventional and surgical oncologists, as well as other members of multidisciplinary teams, are essential for the optimization of patient care in oncology. This multidisciplinary approach is particularly important in the current landscape, in which standard-of-care approaches to cancer treatment are evolving towards highly targeted treatments, precise image guidance and personalized cancer therapy. Herein, we highlight the importance of multidisciplinarity and interdisciplinarity at all levels of clinical oncology training. Potential deficits in the current career development pathways and suggested strategies to broaden clinical training and research are presented, with specific emphasis on the merits of trainee involvement in functional multidisciplinary teams. Finally, the importance of training in multidisciplinary research is discussed, with the expectation that this awareness will yield the most fertile ground for future discoveries. Our key message is for cancer professionals to fulfil their duty in ensuring that trainees appreciate the importance of multidisciplinary research and practice.
Stormwater harvesting from landscaped areas: effect of herbicide application on water quality and usage
The suitability of stormwater harvested from pervious pavement system (PPS) structures for reuse purposes was investigated under conditions where glyphosate-containing herbicides (GCH) are applied as part of the PPS maintenance procedure. The experiment was based on the four-layered design detailed in CIRIA C582. Results indicated that the highest sodium absorption ratio (SAR) recorded in this study, 1.6, was below the level at which loss of permeability and deterioration of the matrix structure begin to occur. Furthermore, the maximum electrical conductivity (ECw) of 2990 μS cm⁻¹, recorded for the 7200 mg L⁻¹ GCH concentration, was slightly below the classification range at which salinity problems related to water quality occur, whereby salts accumulate in the root zone to the extent that crop yields are adversely affected. A GCH concentration of 720 mg L⁻¹ was within the ‘permissible’ range, while 72 mg L⁻¹ was within the ‘excellent’ range. The current study raises some environmental concerns owing to the overall impact that GCH concentrations above 72 mg L⁻¹ exert on the performance of organic decomposers and on heavy metal and hydrocarbon release from the system, and these impacts should be investigated further. However, effluent from all the test models, including those dosed with the high GCH concentration of 7200 mg L⁻¹, does not pose any threat in terms of infiltration or salinity-related deterioration, although there are indications that high dosages of the herbicide could lead to elevated electrical conductivity of the recycled water.
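For reference, the sodium absorption ratio quoted above is conventionally defined as follows (this is the standard definition, not taken from the abstract; concentrations in meq L⁻¹):

    \mathrm{SAR} = \frac{[\mathrm{Na}^{+}]}{\sqrt{\left([\mathrm{Ca}^{2+}] + [\mathrm{Mg}^{2+}]\right)/2}}

Read against this definition, the reported maximum of 1.6 indicates an effluent in which sodium remains low relative to calcium and magnesium.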
Methods to study splicing from high-throughput RNA Sequencing data
The development of novel high-throughput sequencing (HTS) methods for RNA (RNA-Seq) has provided a very powerful means to study splicing under multiple conditions at unprecedented depth. However, the complexity of the information to be analyzed has turned this into a challenging task. In the last few years, a plethora of tools have been developed, allowing researchers to process RNA-Seq data to study the expression of isoforms and splicing events, and their relative changes under different conditions. We provide an overview of the methods available to study splicing from short RNA-Seq data. We group the methods according to the different questions they address: 1) Assignment of the sequencing reads to their likely gene of origin, addressed by methods that map reads to the genome and/or to the available gene annotations. 2) Recovery of the sequence of splicing events and isoforms, addressed by transcript reconstruction and de novo assembly methods. 3) Quantification of events and isoforms: either after reconstructing transcripts or using an annotation, many methods estimate the expression level or the relative usage of isoforms and/or events. 4) Provision of an isoform or event view of differential splicing or expression, including methods that compare relative event/isoform abundance or isoform expression across two or more conditions. 5) Visualization of splicing regulation, with various tools facilitating the visualization of RNA-Seq data in the context of alternative splicing. In this review, we do not describe the specific mathematical models behind each method. Our aim is rather to provide an overview that could serve as an entry point for users who need to decide on a suitable tool for a specific analysis. We also attempt to propose a classification of the tools according to the operations they perform, to facilitate the comparison and choice of methods. Comment: 31 pages, 1 figure, 9 tables. Small corrections added.
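As a concrete illustration of the quantification and comparison steps described in points 3) and 4) above, the sketch below (hypothetical, not taken from any specific tool covered by the review) computes the percent spliced-in (PSI) of an exon-skipping event from junction read counts, and the change in PSI between two conditions:

    # Hypothetical sketch, not tied to any specific tool from the review:
    # percent spliced-in (PSI) for an exon-skipping event, estimated from
    # junction-spanning read counts, plus the difference in PSI (dPSI)
    # between two conditions.

    def psi(inclusion_reads: int, exclusion_reads: int) -> float:
        """PSI = inclusion / (inclusion + exclusion).

        inclusion_reads counts reads over the two inclusion junctions and is
        halved so that it is on the same scale as exclusion_reads, which
        counts reads over the single skipping junction.
        """
        incl = inclusion_reads / 2.0
        total = incl + exclusion_reads
        return incl / total if total > 0 else float("nan")

    # Toy numbers: the event is ~67% included in condition A and ~29% in B.
    psi_a = psi(inclusion_reads=80, exclusion_reads=20)
    psi_b = psi(inclusion_reads=40, exclusion_reads=50)
    print(f"PSI_A={psi_a:.2f}  PSI_B={psi_b:.2f}  dPSI={psi_a - psi_b:.2f}")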
A Novel Semi-Supervised Methodology for Extracting Tumor Type-Specific MRS Sources in Human Brain Data
Background: The clinical investigation of human brain tumors often starts with a non-invasive imaging study, providing information about the tumor extent and location, but little insight into the biochemistry of the analyzed tissue. Magnetic Resonance Spectroscopy can complement imaging by supplying a metabolic fingerprint of the tissue. This study analyzes single-voxel magnetic resonance spectra, which represent signal information in the frequency domain. Given that a single voxel may contain a heterogeneous mix of tissues, signal source identification is a relevant challenge for the problem of tumor type classification from the spectroscopic signal. Methodology/Principal Findings: Non-negative matrix factorization techniques have recently shown their potential for the identification of meaningful sources from brain tissue spectroscopy data. In this study, we use a convex variant of these methods that is capable of handling negatively-valued data and generating sources that can be interpreted as tumor class prototypes. A novel approach to convex non-negative matrix factorization is proposed, in which prior knowledge about class information is utilized in model optimization. Class-specific information is integrated into this semi-supervised process by setting the metric of a latent variable space where the matrix factorization is carried out. The reported experimental study comprises 196 cases of different tumor types drawn from two international, multi-center databases. The results indicate that the proposed approach outperforms a purely unsupervised process by achieving near-perfect correlation of the extracted sources with the mean spectra of the tumor types. It also improves tissue type classification. Conclusions/Significance: We show that source extraction by unsupervised matrix factorization benefits from the integration of the available class information, thus operating in a semi-supervised learning manner, for discriminative source identification and brain tumor labeling from single-voxel spectroscopy data. We are confident that the proposed methodology has wider applicability for biomedical signal processing.
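For context, the standard convex NMF formulation on which such variants build (a common formulation in the literature, not necessarily the authors' exact model) constrains the basis to non-negative combinations of the data points themselves, which is what allows mixed-sign spectra to be factorized and the resulting sources to be read as prototypes:

    \min_{W \ge 0,\; G \ge 0} \; \bigl\| X - X W G^{\mathsf{T}} \bigr\|_F^2,
    \qquad X \in \mathbb{R}^{d \times n},\; W \in \mathbb{R}^{n \times k},\; G \in \mathbb{R}^{n \times k},

where the columns of X are the spectra, the k columns of XW are the extracted sources, and G holds the non-negative mixing weights; in the semi-supervised variant described above, the factorization is additionally carried out in a latent space whose metric encodes the available class labels.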
Toxoplasma Effector MAF1 Mediates Recruitment of Host Mitochondria and Impacts the Host Response
Recent work has revealed the functional diversity and importance of mitochondria in many cellular processes, including orchestrating the innate immune response. Intriguingly, several infectious agents, such as Toxoplasma, Legionella, and Chlamydia, have been reported to grow within vacuoles surrounded by host mitochondria. Although many hypotheses have been proposed for the existence of host mitochondrial association (HMA), the causes and biological consequences of HMA have remained unclear. Here we show that HMA is present in type I and III strains of Toxoplasma but missing in type II strains, both in vitro and in vivo. Analysis of F1 progeny from a type II×III cross revealed that HMA is a Mendelian trait that we could map. We use bioinformatics to select potential candidates and experimentally identify the polymorphic parasite protein involved, mitochondrial association factor 1 (MAF1). We show that introducing the type I (HMA+) MAF1 allele into type II (HMA-) parasites results in conversion to HMA+, and that deletion of MAF1 in type I parasites results in a loss of HMA. We observe that the loss and gain of HMA are associated with alterations in the transcription of host cell immune genes and the in vivo cytokine response during murine infection. Lastly, we use exogenous expression of MAF1 to show that it binds host mitochondria; thus MAF1 is the parasite protein directly responsible for HMA. Our findings suggest that association with host mitochondria may represent a novel means by which Toxoplasma tachyzoites manipulate the host. The existence of naturally occurring HMA+ and HMA- strains of Toxoplasma, Legionella, and Chlamydia indicates the existence of evolutionary niches where HMA is either advantageous or disadvantageous, likely reflecting tradeoffs in metabolism, immune regulation, and other functions of mitochondria.
Walk with Me: a protocol for a pilot RCT of a peer-led walking programme to increase physical activity in inactive older adults
Background: Levels of physical activity decline with age. Some of the most disadvantaged individuals in society, such as those from a lower socio-economic position, are also the most inactive. Increasing physical activity levels, particularly among those most inactive, is a public health priority. Peer-led physical activity interventions may offer a model to increase physical activity in the older adult population. This study aims to test the feasibility of a peer-led, multicomponent physical activity intervention in socio-economically disadvantaged, community-dwelling older adults. Methods: The Medical Research Council framework for developing and evaluating complex interventions will be used to design and test the feasibility of a randomised controlled trial (RCT) of a multicomponent peer-led physical activity intervention. Data will be collected at baseline, immediately after the intervention (12 weeks), and 6 months after baseline measures. The pilot RCT will provide information on recruitment of peer mentors and participants, attrition rates, intervention fidelity, and the variability of the primary outcome (minutes of moderate-to-vigorous physical activity measured with an accelerometer). The pilot trial will also assess the acceptability of the intervention and identify potential resources needed to undertake a definitive study. Data analyses will be descriptive and include an evaluation of eligibility, recruitment, and retention rates. The findings will be used to estimate the sample size required for a definitive trial. A detailed process evaluation using qualitative and quantitative methods will be conducted with a variety of stakeholders to identify areas of success and necessary improvements. Discussion: This paper describes the protocol for the ‘Walk with Me’ pilot RCT, which will provide the information necessary to inform the design and delivery of a fully powered trial, should the Walk with Me intervention prove feasible.
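As an illustration of how the pilot's variability estimate could feed into a definitive-trial sample size, the sketch below uses the standard two-sample normal-approximation formula for comparing means; the protocol abstract does not specify the actual calculation, and all numbers here are purely hypothetical:

    # Hypothetical sketch: participants per arm for a definitive trial comparing
    # mean weekly minutes of moderate-to-vigorous physical activity between two
    # groups. Neither the formula choice nor the numbers come from the protocol.
    import math
    from statistics import NormalDist

    def n_per_group(sd: float, difference: float,
                    alpha: float = 0.05, power: float = 0.80) -> int:
        """Participants per arm to detect `difference` with the given power,
        where `sd` is the outcome standard deviation estimated from the pilot."""
        z = NormalDist()
        z_alpha = z.inv_cdf(1 - alpha / 2)   # two-sided significance level
        z_beta = z.inv_cdf(power)            # desired power
        n = 2 * ((z_alpha + z_beta) * sd / difference) ** 2
        return math.ceil(n)

    # Illustrative numbers only: SD of 90 min/week, target difference of 30 min/week.
    print(n_per_group(sd=90, difference=30))   # about 142 per arm, before allowing for attrition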
