
    OER Handbook for Educators

    Welcome to the world of Open Educational Resources (OER). This handbook is designed to help educators find, use, develop and share OER to enhance their effectiveness online and in the classroom. Although no prior knowledge of OER is required, some experience using a computer and browsing the Internet will be helpful. For example, it is preferable that you have experience using a word processor (e.g. Open Office or Microsoft Word) and basic media production software, such as an image editor (e.g. Gimp, Inkscape or Photoshop). The handbook works best when there is some sort of OER you would like to create or make available to others, but it is also useful for the curious reader. There are several ways to use this handbook, including: cover-to-cover, which is intended for newcomers who want to gain an understanding of OER and engage in the whole development cycle (find, compose, adapt, use, share, ...) in a real-world setting; and individual sections, as a quick reference for educators engaged in OER development looking for pointers at any stage of the OER development cycle. You are not expected to be an instructional designer or media production expert to use this book. If you encounter a term with which you are unfamiliar, check the glossary at the end of the handbook for a definition. What this handbook does not cover: OER is a broad topic and it would be difficult, if not impossible, to cover it comprehensively. This handbook does not include tutorials on the software used, though the URLs of some tutorials have been provided. It also does not prescribe a particular teaching method when using OER. If you are educational technology staff at an institution, the institution handbook may be more appropriate for you. There will also be a handbook available for policy-makers such as superintendents and higher education staff. See the Introduction to Other Handbooks in the Conclusion for more information.

    The FERRUM project: Transition probabilities for forbidden lines in [FeII] and experimental metastable lifetimes

    Accurate transition probabilities for forbidden lines are important diagnostic parameters for low-density astrophysical plasmas. In this paper we present experimental atomic data for forbidden [Fe II] transitions that are observed as strong features in astrophysical spectra. Aims: To measure lifetimes of the 3d^6(^3G)4s a ^4G_{11/2} and 3d^6(^3D)4s b ^4D_{1/2} metastable levels in Fe II and experimental transition probabilities for the forbidden transitions 3d^7 a ^4F_{7/2,9/2} - 3d^6(^3G)4s a ^4G_{11/2}. Methods: The lifetimes were measured at the ion storage ring facility CRYRING using a laser probing technique. Astrophysical branching fractions were obtained from spectra of Eta Carinae recorded with the Space Telescope Imaging Spectrograph onboard the Hubble Space Telescope. The lifetimes and branching fractions were combined to yield absolute transition probabilities. Results: The lifetimes of the a ^4G_{11/2} and b ^4D_{1/2} levels have been measured as 0.75(10) s and 0.54(3) s, respectively. Furthermore, we have determined the transition probabilities for the two forbidden a ^4F_{7/2,9/2} - a ^4G_{11/2} transitions at 4243.97 and 4346.85 A. Both the lifetimes and the transition probabilities are compared to calculated values in the literature. Comment: 5 pages, accepted for publication in A&A.

    A conceptual method for data integration in business analytics

    Today many organizations operate in dynamic, rapidly changing environments and highly competitive markets. Consequently, fast and accurate fact-based decisions can be an important success factor. The basis for such decisions is usually business information produced by business intelligence and business analytics. One of the challenges of creating high-quality information for business decisions is consolidating data that is spread across multiple heterogeneous systems throughout the organization, in one or many different locations. Typically, ETL processes (Extraction, Transformation and Loading) are used to merge heterogeneous data from one or more data sources into a target system to form data repositories, data marts, or data warehouses. Due to the lack of common methods for systematically managing such ETL processes, and the high complexity of integrating data from multiple sources into one common, unified view, it is difficult for both professionals and less experienced users to consolidate data successfully. Currently, the analysis process is often performed without any predefined framework and rests on informal knowledge rather than a scientific methodology. For commercial tools that support the data integration process, including visualization of the integration, reuse of analysis sequences and automatic translation of the visual description into executable code, the major problem is that the metadata used for data integration generally captures only syntactic knowledge. Semantic information about the data structure is typically available only in rudimentary form, even though it plays a significant role in defining the analysis model and evaluating the results.
    Against this background, Grossmann formulated the "Conceptual Approach for Data Integration for Business Analytics". It aims to reduce the complexity of analytical processes and to support professionals in their work, thereby also making the process accessible to less experienced users in different domains. The idea is to incorporate detailed knowledge about the data in business analytics, especially information about semantics. The approach focuses on including a more structured description of the transformation process in business analytics, in which information about dependencies and side effects of the algorithms is also included. Furthermore, it incorporates the concept of meta-modelling: it presents a framework with modelling concepts for data integration for business analytics. Based on Grossmann's approach, the goal of this master's thesis is to develop a meta-model prototype that supports data integration for business analytics. The focus lies on the intellectual process of transforming the theoretical method into a conceptual model that can be applied within a framework of modelling methods and that fits the specific concepts of the meta-model platform used. The result is a prototype based on a generic conceptual method that is independent of any execution platform; there are no predefined granularity levels, and the model objects are reusable across the different phases of the data integration process. The prototype was deployed on the Open Model Platform, an initiative started at the University of Vienna that aims to extend the usage of modelling methods and models and to make them more accessible to users by offering a framework covering all kinds of modelling activities useful for business applications.
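The extract-transform-load pattern described above can be sketched minimally. This is an illustration only: the sources, field names, and unified schema below are hypothetical and not taken from the thesis.

```python
# Minimal ETL sketch: extract rows from heterogeneous sources, transform
# them into one unified schema, and load them into a target store.
# All source systems and field names here are hypothetical.

def extract(sources):
    """Yield raw records from every source system."""
    for source in sources:
        yield from source

def transform(row):
    """Map heterogeneous records onto one unified schema."""
    return {
        "customer": row.get("cust") or row.get("customer_name"),
        "revenue": float(row.get("rev") or row.get("revenue") or 0),
    }

def load(rows, target):
    """Append transformed rows to the target store."""
    target.extend(rows)

# Two hypothetical source systems with different schemas:
crm = [{"cust": "ACME", "rev": "1200.50"}]
erp = [{"customer_name": "ACME", "revenue": 990}]

warehouse = []
load((transform(r) for r in extract([crm, erp])), warehouse)
print(warehouse)
```

The point of the sketch is the separation of concerns the abstract describes: source heterogeneity is absorbed entirely in the transform step, so the load step sees only the unified view.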

    The FERRUM project: laboratory-measured transition probabilities for Cr II

    Aims: We measure transition probabilities for Cr II transitions from the z ^4H_J, z ^2D_J, y ^4F_J, and y ^4G_J levels in the energy range 63000 to 68000 cm^{-1}. Methods: Radiative lifetimes were measured using time-resolved laser-induced fluorescence from a laser-produced plasma. In addition, branching fractions were determined from intensity-calibrated spectra recorded with a UV Fourier transform spectrometer. The branching fractions and radiative lifetimes were combined to yield accurate transition probabilities and oscillator strengths. Results: We present laboratory-measured transition probabilities for 145 Cr II lines and radiative lifetimes for 14 Cr II levels. The laboratory-measured transition probabilities are compared to values from semi-empirical calculations and laboratory measurements in the literature. Comment: 13 pages. Accepted for publication in A&A.

    Ammonia Measurements and Emissions from a California Dairy Using Point and Remote Sensors

    Ammonia (NH3) is an important trace gas species in the atmosphere that can have negative impacts on human, animal, and ecosystem health. Agriculture has been identified as the largest source of NH3, specifically livestock operations. NH3 emissions from a commercial dairy in California were investigated during June 2008. Cattle were held in open-lot pens, except for young calves in hutches with shelters. Solid manure was stored in the open-lot pens. Liquid manure from feed lanes was passed through a solids settling basin and stored in a holding pond. Passive sensors and open-path Fourier transform infrared spectrometers (OP-FTIR) were deployed around the facility to measure NH3 concentrations. Emissions from the pens and the liquid manure system (LMS) were estimated using inverse modeling. Mean emission factors (EFs) for the entire facility were 140.5 ± 42.5 g d^-1 animal^-1 from the passive sampler data and 199.2 ± 22.0 g d^-1 animal^-1 from the OP-FTIR data, resulting in summer emissions for the facility of 265.2 ± 80.2 kg d^-1 and 375.4 ± 27.1 kg d^-1, respectively. These EFs are within the range of values reported in the literature. Both concentrations and emissions exhibited a strong diurnal cycle, peaking in the late afternoon. Total facility emissions exhibited significant positive correlations with temperature and wind speed. The findings of this study show that NH3 emissions from a commercial dairy can vary by a factor of 10 or more throughout the day, and EFs can vary by two orders of magnitude when compared to other U.S. dairies, based on literature values.
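The facility totals above follow from the per-animal emission factor and the herd size. A minimal sketch of that arithmetic; the herd size is back-solved from the reported numbers (about 1888 animals) and is an assumption for illustration, not a figure stated in the study.

```python
# Facility emission from a per-animal emission factor (EF):
#   E_facility (kg/d) = EF (g/d/animal) * N_animals / 1000
# N_ANIMALS is implied by the reported totals (265.2 kg/d at
# 140.5 g/d/animal) and is an illustrative assumption.

def facility_emission_kg_per_day(ef_g_per_animal_day, n_animals):
    """Scale a per-animal emission factor up to a whole-facility rate."""
    return ef_g_per_animal_day * n_animals / 1000.0

N_ANIMALS = 1888  # hypothetical herd size, back-solved from the abstract
print(round(facility_emission_kg_per_day(140.5, N_ANIMALS), 1))  # prints 265.3
```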

    Electrophilic PPARγ Ligands Attenuate IL-1β and Silica-Induced Inflammatory Mediator Production in Human Lung Fibroblasts via a PPARγ-Independent Mechanism

    Acute and chronic lung inflammation is associated with numerous important disease pathologies including asthma, chronic obstructive pulmonary disease and silicosis. Lung fibroblasts are a novel and important target of anti-inflammatory therapy, as they orchestrate, respond to, and amplify inflammatory cascades and are the key cell in the pathogenesis of lung fibrosis. Peroxisome proliferator-activated receptor gamma (PPARγ) ligands are small molecules that induce anti-inflammatory responses in a variety of tissues. Here, we report for the first time that PPARγ ligands have potent anti-inflammatory effects on human lung fibroblasts. 2-cyano-3,12-dioxoolean-1,9-dien-28-oic acid (CDDO) and 15-deoxy-Δ12,14-prostaglandin J2 (15d-PGJ2) inhibit production of the inflammatory mediators interleukin-6 (IL-6), monocyte chemoattractant protein-1 (MCP-1), COX-2, and prostaglandin (PG)E2 in primary human lung fibroblasts stimulated with either IL-1β or silica. The anti-inflammatory properties of these molecules are not blocked by the PPARγ antagonist GW9662 and thus are largely PPARγ independent. However, they are dependent on the presence of an electrophilic carbon. CDDO and 15d-PGJ2, but not rosiglitazone, inhibited NF-κB activity. These results demonstrate that CDDO and 15d-PGJ2 are potent attenuators of proinflammatory responses in lung fibroblasts and suggest that these molecules should be explored as the basis for novel, targeted anti-inflammatory therapies in the lung and other organs.

    The FERRUM Project: experimental and theoretical transition rates of forbidden [Sc II] lines and radiative lifetimes of metastable Sc II levels

    Context. In many plasmas, long-lived metastable atomic levels are depopulated by collisions (quenched) before they decay radiatively. In low-density regions, however, the low collision rate may allow depopulation by electric dipole (E1) forbidden radiative transitions, so-called forbidden lines (mainly M1 and E2 transitions). If the atomic transition data are known, these lines are indicators of physical plasma conditions and used for abundance determination. Aims. Transition rates can be derived by combining relative intensities between the decay channels, so-called branching fractions (BFs), and the radiative lifetime of the common upper level. We use this approach for forbidden [Sc ii] lines, along with new calculations. Methods. Neither BFs for forbidden lines, nor lifetimes of metastable levels, are easily measured in a laboratory. Therefore, astrophysical BFs measured in Space Telescope Imaging Spectrograph (STIS) spectra of the strontium filament of Eta Carinae are combined with lifetime measurements using a laser probing technique on a stored ion-beam (CRYRING facility,MSL, Stockholm). These quantities are used to derive the absolute transition rates (A-values). New theoretical transition rates and lifetimes are calulated using the CIV3 code. Results. We report experimental lifetimes of the Sc ii levels 3d2 a3P0,1,2 with lifetimes 1.28, 1.42, and 1.24 s, respectively, and transition rates for lines from these levels down to 3d4s a3D in the region 8270-8390 A. These are the most important forbidden [Sc ii] transitions. New calculations for lines and metastable lifetimes are also presented, and are in good agreement with the experimental data.Comment: 5 pages. Accepted for A&
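The derivation described in the Aims rests on the standard relation A_ik = BF_ik / tau_i, where the branching fractions of an upper level sum to one. A small sketch of that combination; the lifetime is the measured a ^3P_2 value from the abstract, but the branching fractions are purely hypothetical placeholders.

```python
# Absolute transition rate (A-value) from a branching fraction (BF) and
# the radiative lifetime tau of the common upper level:
#   A_ik = BF_ik / tau_i,  with sum_k BF_ik = 1.

def a_values(tau, branching_fractions):
    """Return absolute transition rates (s^-1), one per decay channel."""
    assert abs(sum(branching_fractions) - 1.0) < 1e-9
    return [bf / tau for bf in branching_fractions]

TAU_A3P2 = 1.24        # s, measured Sc II 3d^2 a ^3P_2 lifetime (from the abstract)
BFS = [0.7, 0.2, 0.1]  # hypothetical branching fractions for illustration
rates = a_values(TAU_A3P2, BFS)
print(rates)
```

Note that the rates of all channels sum to 1/tau, the total decay rate of the level, which is a useful consistency check on measured BFs.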

    The accuracy of stellar atmospheric parameter determinations: a case study with HD 32115 and HD 37594

    We present detailed parameter determinations of two chemically normal late A-type stars, HD 32115 and HD 37594, to uncover the reasons behind large discrepancies between two previous analyses of these stars, one performed with a semi-automatic procedure and one with a "classical" analysis. Our study is based on high-resolution, high signal-to-noise spectra obtained at the McDonald Observatory. Our method is based on the simultaneous use of all available observables: multicolor photometry, pressure-sensitive magnesium lines, metallic lines, and Balmer line profiles. Our final set of fundamental parameters fits, within the error bars, all available observables, and differs from the published results obtained with the semi-automatic procedure. A direct comparison between our new observational material and the spectra previously used by other authors shows that the quality of the data is not the origin of the discrepancies. As the two stars require a substantial macroturbulence velocity to fit the line profiles, we conclude that neglecting this additional broadening in the semi-automatic analysis is one origin of the discrepancy. Using Fe I excitation equilibrium and Fe ionisation equilibrium to derive effective temperature and surface gravity, respectively, while neglecting all other indicators, leads to a systematically and erroneously high effective temperature. We deduce that results obtained using only one parameter indicator may be biased and need to be treated cautiously in further detailed analyses, such as modelling of asteroseismic frequencies or characterising transiting exoplanets. Comment: Accepted for publication by MNRAS.
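The excitation-equilibrium criterion mentioned above can be illustrated numerically: the effective temperature is considered consistent when line-by-line Fe I abundances show no trend with the lines' lower-level excitation potential. The line list below is synthetic, invented purely for illustration.

```python
import numpy as np

# Excitation equilibrium check: fit abundance vs. excitation potential;
# a slope near zero indicates a consistent effective temperature.
# Both arrays below are synthetic illustration data, not measurements.

chi = np.array([0.5, 1.5, 2.5, 3.5, 4.5])          # excitation potential (eV)
abund = np.array([7.52, 7.48, 7.51, 7.49, 7.50])   # log abundance per Fe I line

slope, intercept = np.polyfit(chi, abund, 1)
print(f"slope = {slope:+.4f} dex/eV")  # |slope| ~ 0 -> excitation equilibrium
```

The abstract's warning is that relying on this one indicator alone (or on ionisation equilibrium alone for gravity) can bias the result, which is why the authors fit all observables simultaneously.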