
    Portfolio Management for Private Investors Based on the State Preference Approach

    Existing portfolio theory assesses risk using assumptions of normally distributed returns or correlations estimated from historical data. During the financial crisis of 2008/09, however, correlations between risky assets rose sharply, and at the same time their returns showed negative outliers that a normal distribution cannot explain. This paper therefore applies the State Preference Approach to derive the capital market's implicit probability assumptions and risk attitude. To this end, a square payoff matrix is constructed from the market prices of selected assets in January 2011 and their payoffs at a specified future point in time. The payoffs are forecast with a multivariate regression model for a fixed set of macroeconomic states. It is shown that the capital market's state-price distribution can be obtained as an approximate solution under the principle of arbitrage-free markets and reveals the market's risk attitude. By adjusting the payoffs with the risk-free rate and the asset-specific risk premium, the state prices can be interpreted as the capital market's true probabilities, transferred into the minimum-variance portfolio model, and used, under stated assumptions, to construct and optimize portfolios.
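    The core step described above, recovering state prices from observed market prices under no-arbitrage, reduces to solving a linear system. Below is a minimal, illustrative sketch assuming a hypothetical 3-asset, 3-state payoff matrix; all numbers are invented and are not the paper's January 2011 data.

```python
import numpy as np

# Hypothetical square payoff matrix: rows = assets, columns = macro states.
# Entry Pi[i, j] is the payoff of asset i if state j occurs at time t1.
Pi = np.array([
    [105.0, 100.0, 95.0],   # bond-like asset
    [130.0, 100.0, 60.0],   # equity-like asset
    [115.0, 100.0, 80.0],   # mixed asset
])
# Observed market prices today (invented so the system has a
# positive, arbitrage-free solution).
p = np.array([95.0, 92.0, 93.5])

# No-arbitrage pricing: p = Pi @ q, so the state-price vector q follows
# by solving the (square, full-rank) linear system.
q = np.linalg.solve(Pi, p)          # -> [0.30, 0.35, 0.30]

# A claim paying 1 in every state costs q.sum(), which pins down the
# implied risk-free rate; rescaling q gives risk-neutral probabilities.
rf = 1.0 / q.sum() - 1.0            # ~5.26%
pi_rn = q * (1.0 + rf)              # sums to 1 by construction

print("state prices:      ", q)
print("implied risk-free: ", rf)
print("risk-neutral probs:", pi_rn)
```

    Rescaling by the risk-free rate yields risk-neutral probabilities; adjusting instead with asset-specific risk premia, as the paper describes, would recover the market's "true" probabilities.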

    Designing and Testing an Inventory for Measuring Social Media Competency of Certified Health Education Specialists

    Objective: The aim of this study was to design, develop, and test the Social Media Competency Inventory (SMCI) for CHES and MCHES. Methods: The SMCI was designed in three sequential phases: (1) Conceptualization and Domain Specifications, (2) Item Development, and (3) Inventory Testing and Finalization. Phase 1 consisted of a literature review, concept operationalization, and expert reviews. Phase 2 involved an expert panel (n=4) review, think-aloud sessions with a small representative sample of CHES/MCHES (n=10), a pilot test (n=36), and classical test theory analyses to develop the initial version of the SMCI. Phase 3 included a field test of the SMCI with a random sample of CHES and MCHES (n=353), factor and Rasch analyses, and development of SMCI administration and interpretation guidelines. Results: Six constructs adapted from the unified theory of acceptance and use of technology and the integrated behavioral model were identified for assessing social media competency: (1) Social Media Self-Efficacy, (2) Social Media Experience, (3) Effort Expectancy, (4) Performance Expectancy, (5) Facilitating Conditions, and (6) Social Influence. The initial item pool included 148 items. After the pilot test, 16 items were removed or revised because of low item discrimination (r.90), or based on feedback received from pilot participants. During the psychometric analysis of the field test data, 52 items were removed due to low discrimination, evidence of content redundancy, low R-squared values, or poor item infit or outfit. Psychometric analyses of the data revealed acceptable reliability evidence for the following scales: Social Media Self-Efficacy (alpha = .98, item reliability = .98, item separation = 6.76), Social Media Experience (alpha = .98, item reliability = .98, item separation = 6.24), Effort Expectancy (alpha = .74, item reliability = .95, item separation = 4.15), Performance Expectancy (alpha = .81, item reliability = .99, item separation = 10.09), Facilitating Conditions (alpha = .66, item reliability = .99, item separation = 16.04), and Social Influence (alpha = .66, item reliability = .93, item separation = 3.77). There was some evidence of local dependence among the scales, with several observed residual correlations above |.20|. Conclusions: Through the multistage instrument-development process, sufficient reliability and validity evidence was collected in support of the purpose and intended use of the SMCI. The SMCI can be used to assess the readiness of health education specialists to effectively use social media for health promotion research and practice. Future research should explore associations across constructs within the SMCI and evaluate the ability of SMCI scores to predict social media use and performance among CHES and MCHES.
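    The "alpha" values quoted above are, by psychometric convention, Cronbach's alpha coefficients. As a point of reference, the sketch below shows how alpha is computed from an item-response matrix; the data are made up and unrelated to the SMCI field test.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Made-up 5-point Likert responses: 6 respondents x 4 correlated items.
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(6, 1))                        # latent level
items = np.clip(base + rng.integers(-1, 2, size=(6, 4)), 1, 5)
print(f"alpha = {cronbach_alpha(items.astype(float)):.2f}")
```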

    LHC collimation efficiency during commissioning

    The design of the LHC collimation system requires understanding and maximizing the ultimate performance with all collimators in place. For the commissioning of the LHC, however, it is important to analyze the collimation efficiency with certain subsets of collimators, with increased collimator gaps, and with relaxed set-up tolerances. Dedicated studies of halo tracking and energy deposition have been performed to address this question. The expected cleaning performance and intensity limits are discussed for various collimation scenarios that might be used during commissioning of the LHC.

    An Early & Comprehensive Millimeter and Centimeter Wave and X-ray Study of Supernova 2011dh: A Non-Equipartition Blastwave Expanding into A Massive Stellar Wind

    Only a handful of supernovae (SNe) have been studied at multiple wavelengths, from radio to X-rays, starting a few days after explosion. The early detection and classification of the nearby Type IIb SN 2011dh/PTF11eon in M51 provides a unique opportunity to conduct such observations. We present detailed data obtained at the youngest phase ever observed for a core-collapse supernova (days 3 to 12 after explosion) in the radio, millimeter and X-rays; when combined with optical data, this allows us to explore the early evolution of the SN blast wave and its surroundings. Our analysis shows that the expanding supernova shockwave does not exhibit equipartition (e_e/e_B ~ 1000), and is expanding into circumstellar material that is consistent with a density profile falling like R^-2. Within modeling uncertainties we find an average velocity of the fast parts of the ejecta of 15,000 +/- 1800 km/s, contrary to previous analysis. This velocity places SN 2011dh in an intermediate blast-wave regime between the previously defined compact and extended SN IIb subtypes. Our results highlight the importance of early (~1 day) high-frequency observations of future events. Moreover, we show the importance of combined radio/X-ray observations for determining the microphysics ratio e_e/e_B.

    High Angular Resolution Imaging of Solar Radio Bursts from the Lunar Surface

    Locating low frequency radio observatories on the lunar surface has a number of advantages, including positional stability and a very low ionospheric radio cutoff. Here, we describe the Radio Observatory on the Lunar Surface for Solar Studies (ROLSS), a concept for a low frequency, radio imaging interferometric array designed to study particle acceleration in the corona and inner heliosphere. ROLSS would be deployed during an early lunar sortie or by a robotic rover as part of an unmanned landing. The preferred site is on the lunar near side, to simplify the data downlink to Earth. The prime science mission is to image type II and type III solar radio bursts with the aim of determining the sites at which, and the mechanisms by which, the radiating particles are accelerated. Secondary science goals include constraining the density of the lunar ionosphere by measuring the low radio frequency cutoff of the solar radio emissions or background galactic radio emission; measuring the flux, particle mass, and arrival direction of interplanetary and interstellar dust; and constraining the low energy electron population in astrophysical sources. Furthermore, ROLSS serves a pathfinder function for larger lunar radio arrays. Key design requirements on ROLSS include the operational frequency and angular resolution. The electron densities in the solar corona and inner heliosphere are such that the relevant emission occurs below 10 MHz, essentially unobservable from Earth's surface due to the terrestrial ionospheric cutoff. Resolving the potential sites of particle acceleration requires an instrument with an angular resolution of at least 2 deg at 10 MHz, equivalent to a linear array size of approximately one kilometer. The major components of the ROLSS array are 3 antenna arms, each of 500 m length, arranged in a Y formation, with a central electronics package (CEP) at their intersection. Each antenna arm is a linear strip of polyimide film (e.g., Kapton™) on which 16 single polarization dipole antennas are located by depositing a conductor (e.g., silver). The arms also contain transmission lines for carrying the radio signals from the science antennas to the CEP. Operations would consist of data acquisition during the lunar day, with data downlinks to Earth one or more times every 24 hours.
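    The quoted array size follows from the diffraction relation theta ~ lambda/D. The sketch below reproduces that arithmetic for the stated requirement of 2 deg resolution at 10 MHz; it is a back-of-the-envelope check, not a calculation from the paper.

```python
import math

c = 3.0e8                    # speed of light in m/s
f = 10.0e6                   # observing frequency: 10 MHz
wavelength = c / f           # = 30 m

theta = math.radians(2.0)    # required angular resolution: 2 deg
D = wavelength / theta       # diffraction-limited aperture, theta ~ lambda/D
print(f"required array extent: {D:.0f} m")   # ~860 m, i.e. roughly 1 km
```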

    Virtual karyotyping with SNP microarrays reduces uncertainty in the diagnosis of renal epithelial tumors

    Background: Renal epithelial tumors are morphologically, biologically, and clinically heterogeneous. Different morphologic subtypes require specific management due to markedly different prognosis and response to therapy. Each common subtype has characteristic chromosomal gains and losses, including some with prognostic value. However, copy number information has not been readily accessible for clinical purposes and thus has not been routinely used in the diagnostic evaluation of these tumors. This information can be useful for classification of tumors with complex or challenging morphology. 'Virtual karyotypes' generated using SNP arrays can readily detect characteristic chromosomal lesions in paraffin-embedded renal tumors and can be used to correctly categorize the common subtypes with performance characteristics that are amenable to routine clinical use. Methods: To investigate the use of virtual karyotypes for diagnostically challenging renal epithelial tumors, we evaluated 25 archived renal neoplasms in which sub-classification could not be definitively rendered based on morphology and other ancillary studies. We generated virtual karyotypes with the Affymetrix 10K 2.0 mapping array platform and identified the presence of genomic lesions across all 22 autosomes. Results: In 91% of challenging cases the virtual karyotype unambiguously detected the presence or absence of chromosomal aberrations characteristic of one of the common subtypes of renal epithelial tumors, while immunohistochemistry and fluorescent in situ hybridization had no or limited utility in the diagnosis of these tumors. Conclusion: These results show that virtual karyotypes generated by SNP arrays can be used as a practical ancillary study for the classification of renal epithelial tumors with complex or ambiguous morphology.
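    Virtual karyotypes of this kind rest on comparing tumor probe intensities against a reference and flagging characteristic chromosomal gains and losses. The sketch below illustrates that generic log2-ratio idea with invented numbers; it is not the Affymetrix 10K analysis pipeline used in the study.

```python
import numpy as np

def call_copy_number(tumor: np.ndarray, reference: np.ndarray,
                     gain_thr: float = 0.3, loss_thr: float = -0.3) -> np.ndarray:
    """Flag each probe as gain (+1), loss (-1) or neutral (0) from the
    log2 ratio of tumor vs. reference probe intensities."""
    log2_ratio = np.log2(tumor / reference)
    calls = np.zeros(log2_ratio.shape, dtype=int)
    calls[log2_ratio > gain_thr] = 1
    calls[log2_ratio < loss_thr] = -1
    return calls

# Invented intensities for 8 probes; probes 3-5 simulate a one-copy loss.
reference = np.full(8, 2.0)
tumor = np.array([2.0, 2.1, 1.9, 1.0, 1.1, 0.9, 2.0, 2.05])
print(call_copy_number(tumor, reference))   # -> [ 0  0  0 -1 -1 -1  0  0]
```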

    An Early and Comprehensive Millimetre and Centimetre Wave and X-ray Study of SN 2011dh: a Non-Equipartition Blast Wave Expanding into a Massive Stellar Wind

    Only a handful of supernovae (SNe) have been studied at multiple wavelengths, from the radio to X-rays, starting a few days after the explosion. The early detection and classification of the nearby Type IIb SN 2011dh/PTF 11eon in M51 provides a unique opportunity to conduct such observations. We present detailed data obtained at one of the youngest phases ever of a core-collapse SN (days 3–12 after the explosion) in the radio, millimetre and X-rays; when combined with optical data, this allows us to explore the early evolution of the SN blast wave and its surroundings. Our analysis shows that the expanding SN shock wave does not exhibit equipartition (ϵe/ϵB ∼ 1000), and is expanding into circumstellar material that is consistent with a density profile falling like R^−2. Within modelling uncertainties we find an average velocity of the fast parts of the ejecta of 15 000 ± 1800 km s^−1, contrary to previous analysis. This velocity places SN 2011dh in an intermediate blast wave regime between the previously defined compact and extended SN Type IIb subtypes. Our results highlight the importance of early (∼1 d) high-frequency observations of future events. Moreover, we show the importance of combined radio/X-ray observations for determining the microphysics ratio ϵe/ϵB.

    FORTE satellite constraints on ultra-high energy cosmic particle fluxes

    The FORTE (Fast On-orbit Recording of Transient Events) satellite records bursts of electromagnetic waves arising from near the Earth's surface in the radio frequency (RF) range of 30 to 300 MHz with a dual polarization antenna. We investigate the possible RF signature of ultra-high energy cosmic-ray particles in the form of coherent Cherenkov radiation from cascades in ice. We calculate the sensitivity of the FORTE satellite to ultra-high energy (UHE) neutrino fluxes at different energies beyond the Greisen-Zatsepin-Kuzmin (GZK) cutoff. Some constraints on supersymmetry model parameters are also estimated, due to the limits that FORTE sets on the UHE neutralino flux. The FORTE database consists of over 4 million recorded events to date, including in principle some events associated with UHE neutrinos. We search for candidate FORTE events in the period from September 1997 to December 1999. The candidate production mechanism is coherent VHF radiation from a UHE neutrino shower in the Greenland ice sheet. We demonstrate a high efficiency for selection against lightning and anthropogenic backgrounds. A single candidate out of several thousand raw triggers survives all cuts, and we set limits on the corresponding particle fluxes assuming this event represents our background level.

    A novel SNP analysis method to detect copy number alterations with an unbiased reference signal directly from tumor samples

    Background: Genomic instability in cancer leads to abnormal genome copy number alterations (CNA) as a mechanism underlying tumorigenesis. Using microarrays and other technologies, tumor CNA are detected by comparing tumor sample CN to normal reference sample CN. While advances in microarray technology have improved detection of copy number alterations, the increase in the number of measured signals, noise from array probes, variations in signal-to-noise ratio across batches, and disparity across laboratories impose significant limitations on the accurate identification of CNA regions when comparing tumor and normal samples. Methods: To address these limitations, we designed a novel 'Virtual Normal' (VN) algorithm, which allows construction of an unbiased reference signal directly from test samples within an experiment, using any publicly available normal reference set as a baseline, thus eliminating the need for an in-lab normal reference set. Results: The algorithm was tested using an optimal, paired tumor/normal data set as well as previously uncharacterized pediatric malignant gliomas for which a normal reference set was not available. Using Affymetrix 250K Sty microarrays, we demonstrated improved signal-to-noise ratio and detected significant copy number alterations using the VN algorithm that were validated by independent PCR analysis of the target CNA regions. Conclusions: We developed and validated an algorithm to provide a virtual normal reference signal directly from tumor samples and minimize noise in the derivation of the raw CN signal. The algorithm reduces the variability of assays performed across different reagent and array batches, methods of sample preservation, multiple personnel, and different laboratories. This approach may be valuable when matched normal samples are unavailable or when the paired normal specimens have been subjected to variations in methods of preservation.
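    The central idea, building an unbiased reference from the test samples themselves, can be illustrated with a per-probe robust summary across a cohort. The sketch below is a simplified interpretation under that assumption, not the published VN algorithm.

```python
import numpy as np

def virtual_normal_log_ratios(samples: np.ndarray) -> np.ndarray:
    """samples: (n_samples x n_probes) raw probe intensities.
    Uses the per-probe median across samples as a 'virtual normal'
    reference, assuming most samples are copy-neutral at any probe."""
    reference = np.median(samples, axis=0)   # reference built from the cohort
    return np.log2(samples / reference)      # per-sample copy number signal

# Invented cohort: 5 samples x 6 probes; sample 0 carries a gain at probes 4-5.
rng = np.random.default_rng(1)
samples = rng.normal(loc=2.0, scale=0.05, size=(5, 6))
samples[0, 4:] *= 1.5
ratios = virtual_normal_log_ratios(samples)
print(np.round(ratios[0], 2))   # elevated log2 ratios at the last two probes
```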