2,471 research outputs found

    Evaluating EU Regional Policy: Many Empirical Specifications, One (Unpleasant) Result

    Numerous studies have examined the role of EU regional policy in fostering growth and convergence among European regions, so why conduct another one? We argue that two elements are still missing from the current academic debate that are needed for a sound empirical identification strategy and reliable results: first, the theoretical underpinnings of regional growth models should be taken more seriously, and second, a similarly careful account of the role of spatial dependence in the underlying data is required. Although research has become increasingly aware of the latter point as an important control for regional heterogeneity and omitted variables, empirical implementations still typically rely on the ad-hoc inclusion of a hardly interpretable 'catch-all' spatial lag of the endogenous variable. We instead follow recent theoretical and empirical approaches that aim to quantify directly the interregional spillovers associated with the amount of funds granted to lagging regions and their neighborhoods. The dataset covers 127 NUTS1/NUTS2 regions within the EU15 over the period 1997-2007. The focus of the investigation is on Objective 1 payments, which are provided to lagging regions with a GDP per capita of less than 75% of the EU average. These payments are the main instrument for pursuing the central aim of European regional policy, namely boosting convergence and harmonious growth across the EU, and they account for about two thirds of total European cohesion policy spending. In our estimations we run a neoclassical convergence model in four main specifications: we distinguish between aspatial and spatial models, and we estimate additive and multiplicative versions in order to obtain the correct coefficient interpretations. We estimate the model in various econometric specifications to assess the effectiveness of this funding. Our results all point to the unpleasant conclusion that EU Structural Funds Objective 1 funding has a remarkably small, or even negative, direct impact on regional growth within the EU15. In most model specifications the spatial funding effects become significantly negative.
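    The abstract does not spell out the estimating equation; as a hedged sketch of what a convergence regression with a direct funding effect and an interregional spillover term can look like (all symbols are our illustrative notation, not the paper's):

```latex
% Hedged sketch of a neoclassical convergence specification with own-region
% Objective 1 funding and a spatially weighted neighborhood funding term.
% The notation (g_i, y_{i,0}, F_i, w_{ij}) is illustrative, not the paper's.
\begin{equation}
  g_i = \alpha + \beta \ln y_{i,0} + \gamma F_i
        + \delta \sum_{j \neq i} w_{ij} F_j + \varepsilon_i
\end{equation}
% g_i     : average growth of GDP per capita of region i over 1997-2007
% y_{i,0} : initial GDP per capita; beta < 0 indicates conditional convergence
% F_i     : Objective 1 funds received by region i (direct effect gamma)
% w_{ij}  : spatial weights; delta measures the spillover from funds granted
%           to neighboring regions
```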

    Role of "Intrinsic Charm" in Semi-Leptonic B-Meson Decays

    We discuss the role of so-called "intrinsic-charm" operators in semi-leptonic B-meson decays, which first appear at order 1/m_b^3 in the heavy quark expansion. We show by explicit calculation that -- at scales mu <= m_c -- the contributions from "intrinsic-charm" effects can be absorbed into short-distance coefficient functions multiplying, for instance, the Darwin term. The only remnants of "intrinsic charm" are then logarithms of the form ln(m_c^2/m_b^2), which can be resummed using renormalization-group techniques. As long as the dynamics at the charm-quark scale is perturbative, alpha_s(m_c) << 1, this implies that no non-perturbative matrix elements other than the Darwin and spin-orbit terms have to be introduced at order 1/m_b^3. Hence, no additional sources of hadronic uncertainty have to be taken into account. Similar arguments may be made for higher orders in the 1/m_b expansion.
    Comment: 14 pages, 1 figure, uses slashed.sty, slight modifications to match published version
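    To make the power counting concrete, the following is a schematic, hedged sketch of the heavy quark expansion of the semi-leptonic width in terms of the standard HQE parameters; it does not reproduce the paper's explicit results.

```latex
% Schematic structure of the heavy quark expansion of the semi-leptonic width.
% The c_i are short-distance coefficient functions; illustrative only.
\begin{equation}
  \Gamma \sim \Gamma_0 \Big[ c_0
      + c_{\pi}\,  \frac{\mu_\pi^2}{m_b^2}
      + c_{G}\,    \frac{\mu_G^2}{m_b^2}
      + c_{D}(\mu)\, \frac{\rho_D^3}{m_b^3}
      + c_{LS}\,   \frac{\rho_{LS}^3}{m_b^3}
      + \dots \Big]
\end{equation}
% The abstract's claim: for mu <= m_c the "intrinsic-charm" contributions are
% absorbed into short-distance coefficients such as c_D(mu), leaving only
% logarithms ln(m_c^2/m_b^2) that can be resummed with renormalization-group
% methods, so no new non-perturbative matrix elements appear at order 1/m_b^3.
```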

    Dog electroencephalogram for early safety seizure liability assessments and investigation of species-specific sensitivity for neurological symptoms

    Preclinical safety assessment is an important part of drug development for both humans and animals. In toxicology studies, seizure liability can be detected at high doses in the form of convulsions. Non-convulsive seizures, however, induce only subtle behavioral changes, and their assessment in animals is challenging. Electroencephalography (EEG) is the only method that correlates animal behavior with seizure activity, and video-EEG is the current gold standard for preclinical seizure liability assessments (Authier et al., 2014b). In most cases there are no clear premonitory signs that forewarn of convulsions, but epileptiform EEG activity prior to clinical manifestation has been reported during a period potentially sufficient for prophylactic anticonvulsive treatment (Dürmüller et al., 2007). The aim of this thesis was to investigate a study design for the assessment of neurological symptoms in dogs. This design should optimize the detection of neurological signs while minimizing study duration and animal numbers. Video-EEG was used to increase the symptom detection rate and to explore the possibility of refining seizure liability testing by enabling EEG-based anticonvulsive treatment. To establish the EEG system in our facility, reference substances were tested first. Then, three in-house drug candidates with different modes of action and known neurological side effects were chosen. Two telemetered beagle dogs were used per experiment. Substance effects on clinical symptoms and on the EEG were investigated. CSF and blood samples for the analysis of drug exposure and biomarkers were collected at the time of symptom observation. Results were compared with previous toxicological studies, thereby enabling an evaluation of differences between non-rodent species in their sensitivity to neurological symptoms. The results showed that combining implants for CSF collection and EEG recording is possible. In this study design, intravenous administration was superior to oral dosing because it reduced the variability in exposure levels. The experimental time was also significantly shorter than in standard toxicology studies, while the same neurological symptoms were induced. This shortened duration enabled continuous clinical observation, allowing a better evaluation of CNS effects and immediate veterinary assistance in the spirit of animal welfare. The EEG was not superior to clinical observation in forewarning of convulsion risk and did not enable convulsion prevention. This was due, first, to the short latency between the onset of abnormal EEG activity and convulsions, which was below one minute with the in-house compounds, and second, to the limited accuracy with which the unfiltered EEG signal could be interpreted, in particular when differentiating artefacts from epileptiform activity. In conclusion, a study design using intravenous infusions is suitable for the characterization of neurological symptoms. All symptoms already known from studies with a longer duration were also observed, which allowed a better correlation of neurological symptoms with exposure and immediate veterinary treatment. For substances with a high risk of inducing severe neurological symptoms, such studies can guide dose selection for longer regulatory toxicology studies in order to prevent the occurrence of severe neurological symptoms.

    German abstract (translated): In the development of human and veterinary medicinal products, the safety of new drugs is investigated in preclinical safety studies. Adverse central nervous system effects are often only recognized in toxicological testing when convulsions occur in the test animals at high doses. Epileptic seizures can, however, also cause more subtle symptoms that are difficult to recognize in animals. Electroencephalography (EEG) is the only way to diagnose non-convulsive seizures in animal studies. The combination of video monitoring and EEG is therefore currently the gold standard in preclinical drug development for assessing the risk that a substance induces seizures (Authier et al., 2014b). In most cases there are no clinical warning signs before convulsions occur, but epileptiform EEG activity has been observed before clinical symptoms, and the reported time window is potentially sufficient for prophylactic anticonvulsive treatment (Dürmüller et al., 2007). The aim of this work was to evaluate, in pilot studies, a new study design for the characterization of neurological side effects. This design should optimize the detection rate of neurological side effects while reducing the number of animals required and the study duration. The use of EEG and video monitoring was intended to make it possible to detect substance-induced seizures at an early stage and to prevent their clinical manifestation. To newly establish the EEG system at the research facility and to evaluate whether implants for CSF collection and EEG recording are compatible, reference substances were tested first. To address the actual research question, three drug candidates with different modes of action and known neurological side effects were selected. Two dogs with implanted EEG transmitters were used per substance test. Two of the substances were administered in escalating intravenous doses; the third was given as a single oral dose. The effects of the substances on clinical symptoms and on the EEG were evaluated. In parallel, blood and CSF samples were taken to determine drug levels and potential biomarkers. The choice of substances additionally made it possible to compare the sensitivity to neurological symptoms of the two non-rodent species regularly used in drug testing, dog and monkey. The results show that combining implants for EEG recording and CSF sampling is possible. Intravenous administration was preferable to oral dosing because the variability of plasma levels was lower. All symptoms known from earlier toxicological studies of longer duration were also observed, but because of the dosing scheme their occurrence was compressed into a shorter period. The short study duration enabled continuous clinical observation, and thus the detection of all symptoms and prompt veterinary treatment, which is an advantage from an animal-welfare perspective. For the early detection of convulsions, the EEG was not better suited than clinical observation, because interpretation of the unfiltered EEG signal was hampered by artefacts. The study design in which the EEG was used is suitable for the characterization of neurological side effects, since all symptoms known from studies of longer duration were also observed. The shortened duration made it possible to correlate symptoms with plasma drug levels and to provide prompt veterinary treatment. For substances with a high risk of neurological side effects, this study design can be used to determine, ahead of regulatory toxicology studies, doses at which no severe neurological side effects are expected.
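    The thesis worked with the unfiltered EEG signal; purely as an illustration of the kind of preprocessing that could ease the artefact problem described above, here is a minimal band-pass filtering sketch with SciPy (the 0.5-40 Hz band and the helper name are assumptions, not parameters from the study):

```python
# Minimal sketch: band-pass filter a raw EEG trace to suppress slow drift and
# high-frequency artefacts before visual review. The 0.5-40 Hz band and the
# function name are illustrative assumptions, not taken from the thesis.
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(raw: np.ndarray, fs: float, low: float = 0.5, high: float = 40.0) -> np.ndarray:
    """Zero-phase Butterworth band-pass filter for a 1-D EEG signal sampled at fs Hz."""
    nyq = 0.5 * fs
    b, a = butter(N=4, Wn=[low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, raw)  # filtfilt avoids phase distortion of epileptiform waveforms

# Example: filter 10 s of synthetic "EEG" sampled at 500 Hz
fs = 500.0
t = np.arange(0, 10, 1 / fs)
raw = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.randn(t.size)  # 3 Hz rhythm plus noise
clean = bandpass_eeg(raw, fs)
```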

    High-resolution mapping of a QTL for Fusarium Head Blight resistance on chromosome 2A in Triticum monococcum

    German abstract (translated): Securing and increasing wheat yield is of great global importance today in order to feed a steadily growing population. A very important disease of wheat is Fusarium Head Blight (FHB), caused by various Fusarium spp. fungi. It can lead to yield losses of up to 40% and, through the formation of mycotoxins during the infection cycle, reduce quality and endanger the health of humans and animals. In this study, the resistance behavior of einkorn (Triticum monococcum) towards Fusarium was investigated. For this purpose, a DH population of 94 DH lines was created and analyzed, derived from a cross between Triticum monococcum L. (mon10-1: moderately resistant) and Triticum monococcum L. conv. sinskayae (Sinskayae: susceptible). The DH population was phenotyped in two years of field trials and genotyped with DArT and SSR markers, resulting in a genetic map of 1987.55 cM. In a subsequent QTL analysis, two neighboring QTL were mapped on chromosome 2A in an interval of 45.1 cM (31.4 Mbp). Using a map-based cloning approach, the QTL interval was narrowed in order to identify closely linked markers or candidate genes underlying this variation. To this end, a high-resolution mapping population of 1991 F2 plants was created, derived from crosses between two resistant and three susceptible DH lines of the original DH population. 333 recombinant inbred lines (RILs) were identified; 268 of these RILs were phenotyped in greenhouse and field trials with the Fusarium isolate Fc46 and genotyped with 21 KASP markers newly developed from genotyping-by-sequencing (GBS), the 90K iSelect chip and the genetic map of Triticum monococcum. Nevertheless, it was not possible to map the resistance locus within the interval. A new QTL analysis using the physical positions of a reduced marker set from the original DH population showed that the peak markers shift to a region between 499.25 and 607.96 Mbp. The sog gene, which is responsible for the spike shape of Triticum sinskayae, is also assumed to lie in this region. It is unclear whether the observed effect is caused by tight linkage of the two genes in this genomic region or by pleiotropy.

    Securing wheat production is of prime importance with regard to feeding the earth's growing population. Wheat is threatened by many abiotic and biotic factors that lead to severe yield losses. One important disease is Fusarium Head Blight (FHB), caused by different Fusarium spp. The disease leads to yield losses of up to 40%, a reduction in quality and a health risk due to the toxic secondary metabolites that arise during the infection process. FHB therefore belongs to the most important wheat diseases and is studied extensively worldwide. To improve the resistance of wheat to Fusarium spp., this study was conducted to obtain detailed information on the genetics of a new source of resistance, detected in Triticum monococcum, a close relative of bread wheat. To achieve this, a DH population of 94 DH lines was analysed, based on a cross between the Triticum monococcum accession mon10-1, which is moderately resistant to FHB, and the FHB-susceptible Triticum monococcum L. conv. sinskayae (Sinskayae). The population was phenotyped in two years of field trials and genotyped by DArT analyses, resulting in a genetic map of 1987.55 cM. Based on these data, two neighbouring QTLs were mapped in an interval of 45.1 cM on the short arm of chromosome 2A. Further analyses aimed at narrowing the QTL interval and at identifying closely linked markers and candidate genes by a map-based cloning approach. A high-resolution mapping population was developed out of 1991 F2 plants that traced back to crosses between three susceptible and two resistant DH lines of the original population. 333 RILs were developed, of which 268 were used for phenotypic evaluation with F. culmorum (isolate Fc46) in field and greenhouse trials. Marker saturation was conducted based on the 90K iSelect chip, genotyping-by-sequencing (GBS) and known genetic maps of Triticum monococcum. From these data, 21 KASP markers were developed and mapped within the QTL interval. Assigning these markers to the physical map of T. aestivum resulted in an interval of 31.4 Mbp. However, by phenotyping the respective segmental RILs, the resistance locus could not be located within this interval. A new QTL analysis with a reduced marker set of the DH mapping population, using their physical positions, resulted in a shift of the peak markers to a proximal region of chromosome 2A, into an interval between 499.25 and 607.96 Mbp. This QTL mapped to the same region as the soft glume (sog) gene, but it is unclear whether the QTL effect is due to tight linkage between the sog and FHB resistance genes or to pleiotropy.
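    As a hedged illustration of the single-marker association testing that underlies this kind of QTL scan in a biparental population, here is a minimal sketch; the genotype coding, marker count and phenotype values are invented placeholders, not data from the thesis.

```python
# Minimal sketch of a single-marker scan: for each marker, test whether FHB
# severity differs between the two parental genotype classes (0 = mon10-1
# allele, 1 = Sinskayae allele). All data below are simulated placeholders.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
n_lines, n_markers = 268, 21
genotypes = rng.integers(0, 2, size=(n_lines, n_markers))   # 0/1 parental alleles
severity = rng.normal(loc=30, scale=10, size=n_lines)       # FHB severity (%) per line
severity += 8 * genotypes[:, 10]                             # simulate an effect at marker 10

for m in range(n_markers):
    group_a = severity[genotypes[:, m] == 0]
    group_b = severity[genotypes[:, m] == 1]
    f_stat, p_val = f_oneway(group_a, group_b)               # one-way ANOVA per marker
    score = -np.log10(p_val)                                  # -log10(p), a simple association score (not a formal LOD)
    print(f"marker {m:2d}: p = {p_val:.3g}, -log10(p) = {score:.2f}")
```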

    Some Recent Trends in State Liability for Tort


    Rice plants, drainage and crop rotation influence the methanogenic community in rice field soil

    The continuous flooding of rice fields results in anoxic conditions in the soil, creating an optimal habitat for anaerobic bacteria and methanogenic archaea. In addition, rice plants supply important nutrients for soil microbes by contributing significantly to the soil carbon pool through the excretion of carbon compounds from their roots. This supply of nutrients from the rice plants is assumed to influence microbial community structure and diversity, but the influence is poorly understood. The first part of this thesis investigates the impact of the rice plant and its growth stages on the microbial community inhabiting flooded rice field soil. In a greenhouse experiment we showed that the presence of the rice plant leads to increased growth of both Archaea and Bacteria, detected as a doubling of 16S rRNA gene copy numbers. The overall microbial community composition was largely similar in planted and unplanted soil; however, specific bacterial lineages (e.g. Geobacter) were more abundant in the presence of the rice plant. In the planted soil, major OTUs increased in relative abundance with plant growth stage, indicating that the rice growth stages and the dynamics of root exudation influenced the microbial community. Together, these results suggest that the microbial community in rice field soil is highly adapted to the presence of rice plants, possibly because of the plant-supplied carbon compounds in the soil.

    The traditional method of rice cultivation is flooding of the field. However, with the anticipated increase in the human population, the demand on resources such as water will increase, and rice farmers will probably face periods of restricted water availability. One way to decrease the water demand of rice cultivation is rotation with crops grown under upland conditions, such as maize, which require less water. The second part of this thesis therefore deals with the influence of rice plant growth stages, field conditions and maize cultivation on the microbial community in rice field soil. During the plant growth stages we detected only minor changes in the abundance, composition and activity of both the archaeal and the bacterial communities. In contrast, changes in field management such as drainage and the cultivation of maize resulted in comparatively stronger changes in the bacterial community. Bacterial lineages that increased in relative abundance under non-flooded conditions were either aerobes, such as Spartobacteria and Sphingobacteria, or lineages characterized by their ability to grow under low-substrate conditions, such as Bacteroidetes and Acidobacteria. Besides the archaeal lineages commonly found in rice fields (Methanosarcinaceae, Methanosaetaceae, Methanobacteriaceae and Methanocellaceae), we found notably high numbers of GOM Arc I species within the order Methanosarcinales, which may be anaerobic methane oxidizers. The archaeal community remained largely unchanged throughout the monitored season. Interestingly, we observed increased ribosomal RNA levels per cell under drained conditions. As these conditions were unfavorable for anaerobic bacteria and methanogenic archaea, we interpret this behavior as preparedness for becoming active when conditions improve.

    In the third part of the thesis we followed the introduction of maize cultivation, and the concomitant non-flooded conditions, on fields that had previously been managed as flooded rice fields; the crop rotation was monitored for two additional years. We found only minor differences in bacterial community abundance and activity in the rotational fields compared with flooded rice fields. Acidobacteria and Anaeromyxobacter spp. were enriched in the rotational fields, while members of the anaerobic Chloroflexi and sulfite-reducing members of the Deltaproteobacteria were found at higher abundance in the rice fields. In contrast, we showed that rotation of flooded rice with upland maize led to dramatic changes in the archaeal community, indicated by a decrease of anaerobic methanogenic lineages and an increase of aerobic Thaumarchaeota. This was especially apparent in the strong enrichment of Thaumarchaeota of the Soil Crenarchaeotic Group, mainly Candidatus Nitrososphaera, indicating the increasing importance of ammonia oxidation during drainage. Combining qPCR and pyrosequencing data again revealed increased ribosome numbers per cell for methanogenic species during crop rotation. This stress response, however, did not allow the methanogenic community to recover in the rotational fields during the season of re-flooding and rice cultivation. This thesis provides evidence that rice plants influence the microbial community in the soil (first part), and that alterations in field management, such as drainage or maize cultivation under upland conditions, have minor immediate effects on the overall microbial community (second part) but more strongly pronounced long-term effects, mainly on the archaeal community (third part).
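    The "ribosomes per cell" argument rests on a simple ratio of two molecular read-outs: 16S rRNA transcripts (from RNA) divided by 16S rRNA gene copies (from DNA). A hedged sketch of that calculation follows; the values and helper names are placeholders, not measurements from the thesis.

```python
# Minimal sketch of the per-cell rRNA proxy: 16S rRNA transcripts (RT-qPCR on
# RNA extracts) divided by 16S rRNA gene copies (qPCR on DNA extracts).
# All numbers are illustrative placeholders, not data from the thesis.
from dataclasses import dataclass

@dataclass
class QPCRSample:
    label: str
    rrna_transcripts_per_g_soil: float   # from RT-qPCR on RNA extracts
    rrna_gene_copies_per_g_soil: float   # from qPCR on DNA extracts

def rrna_per_gene_copy(sample: QPCRSample) -> float:
    """Ratio of 16S rRNA transcripts to 16S rRNA gene copies (a per-cell proxy)."""
    return sample.rrna_transcripts_per_g_soil / sample.rrna_gene_copies_per_g_soil

samples = [
    QPCRSample("flooded rice",  5.0e9, 1.0e8),   # placeholder values
    QPCRSample("drained/maize", 4.0e9, 2.0e7),   # fewer gene copies, higher per-cell ratio
]
for s in samples:
    print(f"{s.label}: {rrna_per_gene_copy(s):.0f} rRNA transcripts per gene copy")
```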