
    Urea impedes the hydrophobic collapse of partially unfolded proteins.

    Proteins are denatured in aqueous urea solution. The nature of the molecular driving forces has received substantial attention in the past, whereas the question of how urea acts during the different phases of unfolding is not yet well understood at the atomic level. In particular, it is unclear whether urea actively attacks folded proteins or instead stabilizes unfolded conformations. Here we investigated the effect of urea at different phases of unfolding by molecular dynamics simulations, comparing the behavior of partially unfolded states in aqueous urea solution and in pure water. Whereas the partially unfolded protein in water exhibited hydrophobic collapses as primary refolding events, it remained stable or even underwent further unfolding steps in aqueous urea solution. Furthermore, initial unfolding steps of the folded protein were found not to be triggered by urea but rather to be stabilized by it. The underlying mechanism of this stabilization is a favorable interaction of urea with transiently exposed, less-polar residues and the protein backbone, thereby impeding back-reactions. Taken together, these results suggest that, quite generally, urea-induced protein unfolding proceeds primarily not by active attack. Rather, thermal fluctuations toward the unfolded state are stabilized, and the hydrophobic collapse of partially unfolded proteins toward the native state is impeded. As a result, the equilibrium is shifted toward the unfolded state.

    A case study on variability management in software product lines: identifying why real-life projects fail

    Economies of scale can be seen as a kind of “holy grail” in the state-of-the-art literature on the development of sets of related software systems. Software product line methods are often mentioned in this context because of the variability management aspects they propose for dealing with sets of related software systems; they realize the sought-after reusability. Both variability management and software product lines already have a strong presence in theoretical research, but real-life software product line projects trying to obtain economies of scale still tend to fall short of their targets. The objective of this paper is to study this gap between theory and reality through a case study, in order to see why such a gap exists and to find a way to bridge it. Through analysis of the causes of failure identified by the stakeholders in the case study, the underlying problem, which is found to be located in the requirements engineering phase, is crystallized. The identification of a framework describing the problems will provide practitioners with a better focus for future endeavors in the field of software product lines, so that economies of scale can be achieved.

    Modeling the sorption behavior of two exemplary persistent organic pollutants onto the soil solid phase, based on spectral data and multivariate statistical analyses

    Persistent organic pollutants (POPs) pose a major hazard. Against this background, risk management concerning the fate of POPs in soil is of great interest. The aim of this study was therefore to develop a reliable, fast, and inexpensive method to predict the sorption and desorption of two model POPs. For this purpose, 4-n-nonylphenol (NP) and perfluorooctanoic acid (PFOA) were selected. To develop a predictive model, 72 soil samples were first analyzed conventionally for texture, carbon, iron oxide, and manganese oxide contents, as well as pH. In addition, FTIR spectra in the mid-infrared range were recorded for all soil samples. The sorption of the pollutants to the soil was measured in batch experiments. Based on the FTIR data, which represent summary parameters of the chemical soil composition, a predictive model for the distribution of the surfactants in soil was built using a broad variety of statistical evaluations. In general, the sorption of nonylphenol and perfluorooctanoic acid was linear across all soil samples. Nevertheless, significant differences in the sorption coefficients (KD values) were observed. Whereas the KD values for PFOA ranged between 2 and 80 ml g-1, values from 25 to 1000 ml g-1 were measured for NP. Multiple regression analysis showed that, in the case of PFOA, sorption depended significantly on the iron oxide and organic carbon contents, whereas for nonylphenol only a dependence on the Corg content was observed. Since the two POPs showed different adsorption mechanisms, two separate predictive models were developed.
Since the spectral data carry information on the chemical soil composition, the conventionally obtained results were calibrated against the spectral data. This allowed the identification of the spectral regions that carry the information on iron oxide and organic carbon contents. These relevant spectral regions were then used to build the predictive model based on the random forest algorithm.
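The linear sorption described above can be illustrated with a minimal sketch: for a linear isotherm, the sorbed concentration is Cs = KD · Cw, so KD is the slope of a least-squares line through the origin. The numbers below are purely illustrative, not data from the study.

```python
# Hypothetical sketch of estimating a linear sorption coefficient KD
# from batch-experiment data, assuming a linear isotherm Cs = KD * Cw.
# Concentrations below are illustrative, not from the study.

def estimate_kd(cw, cs):
    """Least-squares slope through the origin: KD = sum(Cw*Cs) / sum(Cw^2)."""
    num = sum(w * s for w, s in zip(cw, cs))
    den = sum(w * w for w in cw)
    return num / den

cw = [0.5, 1.0, 2.0, 4.0]        # equilibrium aqueous concentration
cs = [20.0, 41.0, 79.0, 161.0]   # sorbed concentration per gram of soil

kd = estimate_kd(cw, cs)
print(round(kd, 1))  # → 40.1, within the PFOA range (2-80 ml g-1) reported above
```

A real model, as the abstract describes, would instead regress measured KD values against selected FTIR spectral regions with a random forest; the fitted slope here only captures the linearity of the isotherm itself.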

    The Hepatic Monocarboxylate Transporter 1 (MCT1) Contributes to the Regulation of Food Anticipation in Mice.

    Daily recurring events can be predicted by animals based on their internal circadian timing system. However, independently of the suprachiasmatic nuclei (SCN), the central pacemaker of the circadian system in mammals, restriction of food access to a particular time of day elicits food anticipatory activity (FAA). This suggests an involvement of other central and/or peripheral clocks, as well as metabolic signals, in this behavior. One of the metabolic signals that is important for FAA under combined caloric and temporal food restriction is β-hydroxybutyrate (βOHB). Here we show that the monocarboxylate transporter 1 (Mct1), which transports ketone bodies such as βOHB across the membranes of various cell types, is involved in FAA. In particular, we show that lack of the Mct1 gene in the liver, but not in neuronal or glial cells, reduces FAA in mice. This is associated with a reduction of βOHB levels in the blood. Our observations suggest an important role of ketone bodies and their transporter Mct1 in FAA under caloric and temporal food restriction.

    Impact of duration of chest tube drainage on pain after cardiac surgery

    Objective: This study was designed to analyze the effect of the duration of chest tube drainage on pain intensity and distribution after cardiac surgery. Methods: Two groups of 80 adult cardiac surgery patients, operated on in two different hospitals by the same group of cardiac surgeons and with similar postoperative strategies, were compared. In one hospital (long drainage group), a conservative policy was adopted, with removal of the chest tubes by postoperative day (POD) 2 or 3, while in the second hospital (short drainage group), all the drains were usually removed on POD 1. Results: There was a trend toward less pain in the short drainage group, with a statistically significant difference on POD 2 (P=0.047). There were fewer patients without pain on POD 3 in the long drainage group (P=0.01). The areas corresponding to the tract of the pleural tube, namely the epigastric area, the left base of the thorax, and the left shoulder, were more often involved in the long drainage group. There were three pneumonias in each group, and no patient required repeated drainage. Conclusions: A policy of early chest drain removal limits pain sensation and simplifies nursing care without increasing the need for repeated pleural puncture. A policy of short drainage after cardiac surgery should therefore be recommended.

    Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction

    With the unprecedented photometric precision of the Kepler Spacecraft, significant systematic and stochastic errors on transit signal levels are observable in the Kepler photometric data. These errors, which include discontinuities, outliers, systematic trends, and other instrumental signatures, obscure astrophysical signals. The Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline tries to remove these errors while preserving planet transits and other astrophysically interesting signals. The completely new noise and stellar variability regime observed in Kepler data poses a significant problem for standard cotrending methods such as SYSREM and TFA. Variable stars are often of particular astrophysical interest, so the preservation of their signals is of significant importance to the astrophysical community. We present a Bayesian Maximum A Posteriori (MAP) approach where a subset of highly correlated and quiet stars is used to generate a cotrending basis vector set, which is in turn used to establish a range of "reasonable" robust fit parameters. These robust fit parameters are then used to generate a Bayesian prior and a Bayesian posterior probability distribution function (PDF) which, when maximized, finds the best fit that simultaneously removes systematic effects while reducing the signal distortion and noise injection that commonly afflict simple least-squares (LS) fitting. A numerical and empirical approach is taken where the Bayesian prior PDFs are generated from fits to the light curve distributions themselves. Comment: 43 pages, 21 figures. Submitted for publication in PASP. Also see companion paper "Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves" by Martin C. Stumpe, et al.
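The core idea of the MAP fit can be sketched in one dimension: with a Gaussian likelihood for the cotrending fit and a Gaussian prior on the coefficient (built, in PDC, from robust fits to quiet stars), the posterior maximum has a closed form that blends the least-squares solution with the prior mean. This is a hypothetical toy illustration of the principle, not the actual PDC implementation, which uses a full basis-vector set.

```python
# Hypothetical 1-D sketch of a MAP cotrending fit: light curve y ~ c*v,
# Gaussian noise with sigma_noise, Gaussian prior c ~ N(mu, sigma_prior^2).
# The prior pulls the fit away from overfitting the stellar signal.

def map_coefficient(y, v, mu, sigma_prior, sigma_noise):
    """Closed-form MAP estimate for y ≈ c*v under a Gaussian prior on c."""
    vy = sum(a * b for a, b in zip(v, y))
    vv = sum(a * a for a in v)
    precision_like = vv / sigma_noise**2          # data precision
    precision_prior = 1.0 / sigma_prior**2        # prior precision
    return (vy / sigma_noise**2 + mu * precision_prior) / (precision_like + precision_prior)

v = [1.0, 2.0, 3.0, 4.0]   # toy cotrending basis vector
y = [1.1, 2.2, 2.9, 4.1]   # toy light curve

c_ls = sum(a * b for a, b in zip(v, y)) / sum(a * a for a in v)  # plain LS fit
c_map = map_coefficient(y, v, mu=0.9, sigma_prior=0.05, sigma_noise=0.1)
print(c_ls, c_map)  # the MAP estimate lies between the LS fit and the prior mean
```

With a tight prior the MAP coefficient shrinks toward the robust-fit ensemble value, which is how such a scheme reduces the signal distortion that an unconstrained least-squares fit would introduce.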

    Corrigendum: Collective search by ants in microgravity

    The problem of collective search is a tradeoff between searching thoroughly and covering as much area as possible. This tradeoff depends on the density of searchers. Solutions to the problem of collective search are currently of much interest in robotics and in the study of distributed algorithms, for example, to design ways in which robots, without central control, can use local information to perform search-and-rescue operations. Ant colonies operate without central control. Because they can perceive only local, mostly chemical and tactile, cues, they must search collectively to find resources and to monitor the colony's environment. Examining how ants in diverse environments solve the problem of collective search can elucidate how evolution has led to diverse forms of collective behavior. An experiment on the International Space Station in January 2014 examined how ants (Tetramorium caespitum) perform collective search in microgravity. In the ISS experiment, the ants explored a small arena in which a barrier was lowered to increase the area and thus lower ant density. In microgravity, relative to ground controls, ants explored the area less thoroughly and took more convoluted paths. It appears that the difficulty of holding on to the surface interfered with the ants' ability to search collectively. Ants frequently lost contact with the surface, but showed a remarkable ability to regain contact with it.

    Kepler Presearch Data Conditioning I - Architecture and Algorithms for Error Correction in Kepler Light Curves

    Kepler provides light curves of 156,000 stars with unprecedented precision. However, the raw data as they come from the spacecraft contain significant systematic and stochastic errors. These errors, which include discontinuities, systematic trends, and outliers, obscure the astrophysical signals in the light curves. Correcting these errors is the task of the Presearch Data Conditioning (PDC) module of the Kepler data analysis pipeline. The original version of PDC in Kepler did not meet the extremely high performance requirements for the detection of minuscule planet transits or highly accurate analysis of stellar activity and rotation. One particular deficiency was that astrophysical features were often removed as a side effect of error removal. In this paper we introduce the completely new and significantly improved version of PDC, which was implemented in Kepler SOC 8.0. This new PDC version, which utilizes a Bayesian approach for the removal of systematics, reliably corrects errors in the light curves while at the same time preserving planet transits and other astrophysically interesting signals. We describe the architecture and the algorithms of this new PDC module, show typical errors encountered in Kepler data, and illustrate the corrections using real light curve examples. Comment: Submitted to PASP. Also see companion paper "Kepler Presearch Data Conditioning II - A Bayesian Approach to Systematic Error Correction" by Jeff C. Smith, et al.
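One of the simpler error classes mentioned above, isolated outliers, is commonly handled by robust sigma-clipping against the median. The sketch below is a generic illustration of that idea, using the median absolute deviation as a robust scale estimate; it is not the Kepler pipeline's actual outlier detector.

```python
# Hypothetical sketch of robust outlier flagging in a light curve:
# deviations from the median are compared against a MAD-based sigma.
# Illustrative only; the real PDC pipeline is far more elaborate.

import statistics

def sigma_clip(flux, threshold=3.0):
    """Return indices where |flux - median| exceeds threshold * robust sigma,
    with robust sigma = 1.4826 * MAD (consistent with Gaussian noise)."""
    med = statistics.median(flux)
    mad = statistics.median(abs(f - med) for f in flux)
    robust_sigma = 1.4826 * mad
    return [i for i, f in enumerate(flux)
            if abs(f - med) > threshold * robust_sigma]

flux = [1.00, 1.01, 0.99, 1.02, 5.00, 1.00, 0.98, 1.01]
print(sigma_clip(flux))  # → [4], the cosmic-ray-like spike
```

Using the MAD rather than the standard deviation keeps the scale estimate itself from being inflated by the very outliers being searched for.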