
    Verbleib von Soziologie-AbsolventInnen der Philipps-Universität Marburg (Whereabouts of Sociology Graduates of Philipps-Universität Marburg)

    Sociology emerged as an academic degree subject at Philipps-Universität Marburg in the 1960s and was expanded from 1972 onwards. In addition to the two core areas of sociological theories and methods of empirical social research, students in the late 1990s could choose a specialization from among six special sociologies. After 30 years of sociology degree programs in Marburg, it seemed sensible to compile data and information on the whereabouts of graduates to date. A group of sociology students in their basic studies (2nd/3rd or 3rd/4th semester), together with their tutor and their lecturer, took on this task within a two-semester empirical research practicum (October 2003 to July 2004). Following graduate-tracking and dropout studies at other universities, research questions for several sub-studies and the corresponding survey instruments were developed, interview techniques were trained, data were collected and analyzed quantitatively and qualitatively, and the final research report was written. Because address research proved difficult, the population was restricted to the graduating cohorts of 1990 to 2003. The following were carried out: a postal questionnaire survey of all reachable graduates (88 completed questionnaires); three occupational-biographical interviews with graduates from the years 1991, 1996, and 2001; four prospective, problem-centered guided interviews with graduates of 2003; two guided interviews, one with a dropout and one with a student who changed subjects; and expert interviews with representatives of potential employers. Depending on the type of data, statistical procedures, qualitative content analysis, or sequential analysis were used for the analyses. The findings permit assessments of the Diplom and Magister degree programs offered in Marburg at the time; for Bachelor's and Master's programs, or for other universities, such a study would have to be modified accordingly

    State of the climate in 2018

    In 2018, the dominant greenhouse gases released into Earth’s atmosphere—carbon dioxide, methane, and nitrous oxide—continued their increase. The annual global average carbon dioxide concentration at Earth’s surface was 407.4 ± 0.1 ppm, the highest in the modern instrumental record and in ice core records dating back 800,000 years. Combined, greenhouse gases and several halogenated gases contribute just over 3 W m⁻² to radiative forcing and represent a nearly 43% increase since 1990. Carbon dioxide is responsible for about 65% of this radiative forcing. With a weak La Niña in early 2018 transitioning to a weak El Niño by the year’s end, the global surface (land and ocean) temperature was the fourth highest on record, with only 2015 through 2017 being warmer. Several European countries reported record high annual temperatures. There were also more high, and fewer low, temperature extremes than in nearly all of the 68-year extremes record. Madagascar recorded a record daily temperature of 40.5°C in Morondava in March, while South Korea set its record high of 41.0°C in August in Hongcheon. Nawabshah, Pakistan, recorded its highest temperature of 50.2°C, which may be a new daily world record for April. Globally, the annual lower troposphere temperature was third to seventh highest, depending on the dataset analyzed. The lower stratospheric temperature was approximately fifth lowest. The 2018 Arctic land surface temperature was 1.2°C above the 1981–2010 average, tying for third highest in the 118-year record, following 2016 and 2017. June’s Arctic snow cover extent was almost half of what it was 35 years ago. Across Greenland, however, regional summer temperatures were generally below or near average. Additionally, a satellite survey of 47 glaciers in Greenland indicated a net increase in area for the first time since records began in 1999. Increasing permafrost temperatures were reported at most observation sites in the Arctic, with the overall increase of 0.1°–0.2°C between 2017 and 2018 being comparable to the highest rate of warming ever observed in the region. On 17 March, Arctic sea ice extent marked the second smallest annual maximum in the 38-year record, larger than only 2017. The minimum extent in 2018 was reached on 19 September and again on 23 September, tying 2008 and 2010 for the sixth lowest extent on record. The 23 September date tied 1997 as the latest sea ice minimum date on record. First-year ice now dominates the ice cover, comprising 77% of the March 2018 ice pack compared to 55% during the 1980s. Because thinner, younger ice is more vulnerable to melting out in summer, this shift in sea ice age has contributed to the decreasing trend in minimum ice extent. Regionally, Bering Sea ice extent was at record lows for almost the entire 2017/18 ice season. For the Antarctic continent as a whole, 2018 was warmer than average. On the highest points of the Antarctic Plateau, the automatic weather station Relay (74°S) broke or tied six monthly temperature records throughout the year, with August breaking its record by nearly 8°C. However, cool conditions in the western Bellingshausen Sea and Amundsen Sea sector contributed to a low melt season overall for 2017/18. High SSTs contributed to low summer sea ice extent in the Ross and Weddell Seas in 2018, underpinning the second lowest Antarctic summer minimum sea ice extent on record. Despite conducive conditions for its formation, the ozone hole at its maximum extent in September was near the 2000–18 mean, likely due to an ongoing slow decline in stratospheric chlorine monoxide concentration. Across the oceans, globally averaged SST decreased slightly since the record El Niño year of 2016 but was still far above the climatological mean. On average, SST has been increasing at a rate of 0.10° ± 0.01°C decade⁻¹ since 1950. The warming appeared largest in the tropical Indian Ocean and smallest in the North Pacific. The deeper ocean continues to warm year after year. For the seventh consecutive year, global annual mean sea level became the highest in the 26-year record, rising to 81 mm above the 1993 average. As anticipated in a warming climate, the hydrological cycle over the ocean is accelerating: dry regions are becoming drier and wet regions rainier. Closer to the equator, 95 named tropical storms were observed during 2018, well above the 1981–2010 average of 82. Eleven tropical cyclones reached Saffir–Simpson scale Category 5 intensity. North Atlantic Major Hurricane Michael’s landfall intensity of 140 kt was the fourth strongest for any continental U.S. hurricane landfall in the 168-year record. Michael caused more than 30 fatalities and $25 billion (U.S. dollars) in damages. In the western North Pacific, Super Typhoon Mangkhut led to 160 fatalities and $6 billion (U.S. dollars) in damages across the Philippines, Hong Kong, Macau, mainland China, Guam, and the Northern Mariana Islands. Tropical Storm Son-Tinh was responsible for 170 fatalities in Vietnam and Laos. Nearly all the islands of Micronesia experienced at least moderate impacts from various tropical cyclones. Across land, many areas around the globe received copious precipitation, notable at different time scales. Rodrigues and Réunion Island near southern Africa each reported their third wettest year on record. In Hawaii, 1262 mm of precipitation at Waipā Gardens (Kauai) on 14–15 April set a new U.S. record for 24-h precipitation. In Brazil, the city of Belo Horizonte received nearly 75 mm of rain in just 20 minutes, nearly half its monthly average. Globally, fire activity during 2018 was the lowest since the start of the record in 1997, with a combined burned area of about 500 million hectares. This reinforced the long-term downward trend in fire emissions driven by changes in land use in frequently burning savannas. However, wildfires burned 3.5 million hectares across the United States, well above the 2000–10 average of 2.7 million hectares. Combined, U.S. wildfire damages for the 2017 and 2018 wildfire seasons exceeded $40 billion (U.S. dollars)
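    The reported SST warming rate is a linear trend over the 1950–2018 series. As a minimal sketch of how such a decadal rate is estimated (the anomaly values below are synthetic, not the report's data), an ordinary least-squares fit suffices:

```python
import numpy as np

# Minimal sketch: estimate a decadal warming trend from annual mean
# SST anomalies by ordinary least squares. The anomaly series here is
# synthetic, generated to mimic a ~0.10 °C per decade trend.
years = np.arange(1950, 2019)
rng = np.random.default_rng(1)
sst_anomaly = 0.010 * (years - 1950) + rng.normal(0.0, 0.08, years.size)

slope_per_year = np.polyfit(years, sst_anomaly, 1)[0]  # °C per year
print(f"trend: {10 * slope_per_year:.2f} °C per decade")
```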

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but it cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also measures the instrument's effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence-per-cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data
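    As a rough illustration of the recommended microsphere protocol (not the authors' analysis code), the sketch below fits a conversion factor from a serial dilution of particles of known count and uses it to turn OD readings into estimated cell counts; all numbers are invented:

```python
import numpy as np

# Hypothetical calibration data: a 2-fold serial dilution of silica
# microspheres (assumed stock count), and the blank-corrected OD600
# measured for each dilution on one plate reader.
particles_per_well = 3.0e8 / 2 ** np.arange(8)
od600 = np.array([1.52, 0.81, 0.40, 0.198, 0.101, 0.049, 0.026, 0.012])

# Keep only the instrument's effective linear range, where doubling
# the particle count roughly doubles the OD (here, OD below ~0.5).
mask = od600 < 0.5

# Least-squares slope through the origin: OD = k * particles
k = np.sum(od600[mask] * particles_per_well[mask]) / np.sum(particles_per_well[mask] ** 2)

def estimate_cell_count(od):
    """Convert a blank-corrected OD600 reading to an estimated
    particle (cell) count via the microsphere calibration factor."""
    return od / k

print(f"calibration factor k = {k:.3e} OD per particle")
print(f"OD 0.25 ≈ {estimate_cell_count(0.25):.2e} cells/well")
```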

    Behavioral and electrophysiological responses to fairness norm violations in antisocial offenders

    Antisocial personality disorder is characterized by a stable, lifelong pattern of disregard for and violation of others' rights. Disruptions in the representation of fairness norms may represent a key mechanism in the development and maintenance of this disorder. Here, we investigated fairness norm considerations and reactions to their violations. To examine electrophysiological correlates, we assessed the medial frontal negativity (MFN), an event-related potential previously linked to violations of social expectancy and norms. Incarcerated antisocial violent offenders (AVOs, n = 25) and healthy controls (CTLs, n = 24) acted as proposers in the dictator game (DG) and ultimatum game (UG) and received fair vs. unfair UG offers from either another human (social context) or a computer (non-social context). Results showed that AVOs made lower offers in the DG but not the UG, indicating more rational and strategic behavior. Most importantly, when acting as recipients in the UG, acceptance rates were modulated by social context in CTLs, while AVOs generally accepted more offers. Correspondingly, ERP data indicated pronounced MFN amplitudes following human offers in CTLs, whereas MFN amplitudes in AVOs were generally reduced. The current data suggest intact fairness norm representations but altered reactions to their violation in antisocial personality disorder

    Analysis of shared common genetic risk between amyotrophic lateral sclerosis and epilepsy

    Because hyper-excitability has been shown to be a shared pathophysiological mechanism, we used the latest and largest genome-wide studies in amyotrophic lateral sclerosis (n = 36,052) and epilepsy (n = 38,349) to determine the genetic overlap between these conditions. First, we found no significant genetic correlation, including when variants were binned by minor allele frequency. Second, we confirmed the absence of polygenic overlap using genomic risk score analysis. Finally, we did not identify pleiotropic variants in meta-analyses of the two diseases. Our findings indicate that amyotrophic lateral sclerosis and epilepsy do not share common genetic risk, showing that hyper-excitability in both disorders has distinct origins
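    The genomic risk score step can be pictured with a toy sketch: a polygenic score is a weighted sum of effect-allele dosages, with weights taken from the discovery GWAS. The data below are simulated for illustration and do not reflect the study's pipeline:

```python
import numpy as np

# Illustrative polygenic (genomic) risk score: for each individual,
# sum the number of effect alleles at each variant, weighted by that
# variant's effect size (log odds ratio) from the discovery GWAS.
rng = np.random.default_rng(0)

n_individuals, n_variants = 1000, 500
dosages = rng.integers(0, 3, size=(n_individuals, n_variants))  # 0/1/2 effect alleles
betas = rng.normal(0.0, 0.05, size=n_variants)                  # toy GWAS effect sizes

prs = dosages @ betas  # one score per individual

# Under polygenic overlap, ALS-derived scores would stratify epilepsy
# case/control status; the study found no such signal.
print(prs[:5])
```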

    Auf der Suche nach Energiearmut: Eine Potentialanalyse des Low-Income-High-Costs-Indikators für Deutschland (Searching for Fuel Poverty: A Potential Analysis of the Low-Income-High-Costs Indicator for Germany)


    Dermatologist-like explainable AI enhances trust and confidence in diagnosing melanoma

    Artificial intelligence (AI) systems have been shown to help dermatologists diagnose melanoma more accurately; however, they lack transparency, hindering user acceptance. Explainable AI (XAI) methods can help to increase transparency, yet often lack precise, domain-specific explanations. Moreover, the impact of XAI methods on dermatologists’ decisions has not yet been evaluated. Building upon previous research, we introduce an XAI system that provides precise and domain-specific explanations alongside its differential diagnoses of melanomas and nevi. Through a three-phase study, we assess its impact on dermatologists’ diagnostic accuracy, diagnostic confidence, and trust in the XAI support. Our results show strong alignment between XAI and dermatologist explanations. We also show that dermatologists’ confidence in their diagnoses and their trust in the support system significantly increase with XAI compared to conventional AI. This study highlights dermatologists’ willingness to adopt such XAI systems, promoting future use in the clinic

    Using common genetic variants to find drugs for common epilepsies

    Better drugs are needed for common epilepsies. Drug repurposing offers the potential of significant savings in the time and cost of developing new treatments. In order to select the best candidate drug(s) to repurpose for a disease, it is desirable to predict the relative clinical efficacy that drugs will have against the disease. Common epilepsy can be divided into different types and syndromes. Different antiseizure medications are most effective for different types and syndromes of common epilepsy. For predictions of antiepileptic efficacy to be clinically translatable, it is essential that the predictions are specific to each form of common epilepsy, and reflect the patterns of drug efficacy observed in clinical studies and practice. These requirements are not fulfilled by previously published drug predictions for epilepsy. We developed a novel method for predicting the relative efficacy of drugs against any common epilepsy, by using its Genome-Wide Association Study summary statistics and drugs’ activity data. The methodological advancement in our technique is that the drug predictions for a disease are based upon drugs’ effects on the function and abundance of proteins, and the magnitude and direction of those effects, relative to the importance, degree and direction of the proteins’ dysregulation in the disease. We used this method to predict the relative efficacy of all drugs, licensed for any condition, against each of the major types and syndromes of common epilepsy. Our predictions are concordant with findings from real-world experience and randomized clinical trials. Our method predicts the efficacy of existing antiseizure medications against common epilepsies; in this prediction, our method outperforms the best alternative existing method: area under receiver operating characteristic curve (mean ± standard deviation) 0.83 ± 0.03 and 0.63 ± 0.04, respectively. Importantly, our method predicts which antiseizure medications are amongst the more efficacious in clinical practice, and which are amongst the less efficacious, for each of the main syndromes of common epilepsy, and it predicts the distinct order of efficacy of individual antiseizure medications in clinical trials of different common epilepsies. We identify promising candidate drugs for each of the major syndromes of common epilepsy. We screen five promising predicted drugs in an animal model: each exerts a significant dose-dependent effect upon seizures. Our predictions are a novel resource for selecting suitable candidate drugs that could potentially be repurposed for each of the major syndromes of common epilepsy. Our method is potentially generalizable to other complex diseases
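    A toy rendering of the scoring idea (not the authors' method, and with invented protein names and numbers): a drug looks promising when the direction and magnitude of its effects on proteins oppose the direction and importance of those proteins' dysregulation in the disease:

```python
import numpy as np

# Toy repurposing score: a drug scores highly when its effects on
# proteins oppose the disease's dysregulation of the same proteins,
# weighted by how strongly each protein is dysregulated.
# All names and values below are invented for illustration.
proteins = ["SCN1A", "GABRA1", "KCNQ2"]
disease_dysregulation = np.array([+0.8, -0.6, -0.4])  # sign: direction; magnitude: importance

drug_effects = {
    "drug_A": np.array([-0.7, +0.5, +0.3]),  # counteracts dysregulation at every protein
    "drug_B": np.array([+0.6, -0.4, 0.0]),   # pushes in the same direction as the disease
}

def repurposing_score(effects, dysregulation):
    """Higher when drug effects oppose disease dysregulation."""
    return float(-np.dot(effects, dysregulation))

print("proteins considered:", ", ".join(proteins))
for name, effects in sorted(
    drug_effects.items(),
    key=lambda kv: repurposing_score(kv[1], disease_dysregulation),
    reverse=True,
):
    print(name, repurposing_score(effects, disease_dysregulation))
```

    In this sketch drug_A, whose effects oppose the disease at every protein, ranks above drug_B, which pushes in the same direction as the disease.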