Evaluating Differences between Psychedelic Drug-Users and Non-Psychedelic Drug Users among College Students
There has been a recent surge in scientific publications documenting the therapeutic applications of psychedelic drugs (e.g., LSD, psilocybin). Emerging research has demonstrated the potential for medicinal use of psychedelic drugs to mitigate psychiatric concerns, including depression, anxiety, and addiction. While research into psychedelics suggests promise for these atypical substances to address psychiatric concerns, additional research is needed on the consequences associated with recreational psychedelic use (i.e., use of psychedelics for fun or to get "high") engaged in outside the observation and guidance of a trained mental health practitioner. The present study used an archival dataset collected from a sample of undergraduate students who completed an assessment battery evaluating drug use and mental health variables. The study focus was restricted to participants who endorsed recreational drug use of any kind (n = 711). Of this stratified sample, participants who endorsed past-year psychedelic drug use (PSY; n = 38) were compared to recreational drug and alcohol users not endorsing prior psychedelic drug use (REC; n = 673). No statistically significant differences were observed between the PSY and REC groups on age, GPA, or stress. However, the PSY group evidenced statistically significantly higher levels of alcohol use, cannabis use, depression, and anxiety (p < .05). Although the findings are correlational and do not suggest causation, they point to potentially greater mental health concerns among individuals reporting recreational psychedelic drug use. Follow-up analyses will further stratify the sample to compare non-PSY drug users to PSY users on variables of interest and will include an expanded discussion of study implications.
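As a rough illustration of the kind of group comparison reported here, the sketch below runs a Welch's t-test on synthetic scores for a small PSY-like group and a larger REC-like group; the data, group means, and choice of test are assumptions for illustration, not the study's actual analysis.

```python
# Illustrative sketch of a two-group comparison on unequal group sizes,
# using synthetic data and Welch's t-test; not the study's actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical anxiety scores for a small PSY group and a larger REC group.
psy_anxiety = rng.normal(loc=12.0, scale=4.0, size=38)
rec_anxiety = rng.normal(loc=10.0, scale=4.0, size=673)

# Welch's t-test does not assume equal variances or equal group sizes.
t_stat, p_value = stats.ttest_ind(psy_anxiety, rec_anxiety, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```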
Maximizing the benefits and minimizing the risks of intervention programs to address micronutrient malnutrition: symposium report.
Interventions to address micronutrient deficiencies have great potential to reduce the related disease and economic burden. However, the potential risks of excessive micronutrient intakes are often not well determined. During the Global Summit on Food Fortification, 9-11 September 2015, in Arusha, a symposium was organized on micronutrient risk-benefit assessments. Using case studies on folic acid, iodine and vitamin A, the presenters discussed how to maximize the benefits and minimize the risks of intervention programs to address micronutrient malnutrition. Pre-implementation assessment of dietary intake, and/or of biomarkers of micronutrient exposure, status and morbidity/mortality, is critical in identifying the population segments at risk of inadequate or excessive intake. Dietary intake models make it possible to predict the effect of micronutrient interventions and their combinations, e.g. fortified food and supplements, on the proportion of the population with intakes below adequate and above safe thresholds. Continuous monitoring of micronutrient intake and biomarkers is critical to identify whether the target population is actually reached and whether subgroups receive excessive amounts, and to inform program adjustments. However, for many micronutrients the relation between regularly high intakes and adverse health consequences is not well understood, nor do biomarkers exist that can detect such consequences. More accurate and reliable biomarkers predictive of micronutrient exposure, status and function are needed to ensure effective and safe intake ranges for vulnerable population groups such as young children and pregnant women. Modelling tools that integrate information on program coverage, dietary intake distribution and biomarkers will further enable program planners to design effective, efficient and safe programs.
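To make the dietary intake modelling step concrete, here is a minimal sketch that simulates a usual-intake distribution and estimates the share of the population below an adequacy threshold and above a safety threshold, before and after a hypothetical fortification program; all thresholds, coverage figures, and intake parameters are illustrative assumptions, not recommended values.

```python
# Minimal sketch of a dietary intake model: estimate the share of a population
# below an adequacy threshold and above a safety threshold, before and after a
# hypothetical fortification program. All numbers are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(1)

EAR = 400.0   # hypothetical "estimated average requirement" (arbitrary units)
UL = 1000.0   # hypothetical upper intake level

# Usual intake is typically right-skewed; a lognormal shape is a common choice.
baseline_intake = rng.lognormal(mean=np.log(350), sigma=0.4, size=100_000)

# A fortification program adds a roughly fixed amount through a staple food;
# supplement users (here 20% of the population) add a further fixed dose.
fortificant = 150.0
supplement = np.where(rng.random(100_000) < 0.20, 300.0, 0.0)
post_intake = baseline_intake + fortificant + supplement

for label, intake in [("baseline", baseline_intake), ("with program", post_intake)]:
    below = np.mean(intake < EAR)
    above = np.mean(intake > UL)
    print(f"{label}: {below:.1%} below EAR, {above:.1%} above UL")
```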
Compensating asynchrony effects in the calculation of financial correlations
We present a method to compensate statistical errors in the calculation of correlations on asynchronous time series. The method is based on the assumption of an underlying time series. We set up a model and apply it to financial data to examine the decrease of calculated correlations towards smaller return intervals (Epps effect). We show that this statistical effect is a major cause of the Epps effect. Hence, we are able to quantify and to compensate it using only trading prices and trading times.
Comment: 13 pages, 7 figures
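The sketch below illustrates the statistical effect the paper targets, not its compensation method: two assets share an underlying correlated price process but trade at random, asynchronous times, and previous-tick returns computed over shorter intervals show a markedly lower estimated correlation (the Epps effect). All parameters are made up.

```python
# Illustrative sketch of the Epps effect on asynchronously observed prices.
import numpy as np

rng = np.random.default_rng(2)
n_seconds = 200_000
rho = 0.7

# Underlying synchronous log-price increments with correlation rho.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n_seconds) * 0.01
prices = np.cumsum(z, axis=0)

def last_trade_price(path, trade_prob):
    """Previous-tick observation: carry forward the price of the last trade."""
    traded = rng.random(len(path)) < trade_prob
    traded[0] = True
    idx = np.maximum.accumulate(np.where(traded, np.arange(len(path)), 0))
    return path[idx]

# Each asset trades at independent random times (about one trade per 20 s).
obs = np.column_stack([last_trade_price(prices[:, 0], 0.05),
                       last_trade_price(prices[:, 1], 0.05)])

for dt in (10, 60, 600):
    returns = obs[::dt][1:] - obs[::dt][:-1]
    corr = np.corrcoef(returns[:, 0], returns[:, 1])[0, 1]
    print(f"return interval {dt:>4} s: estimated correlation {corr:.2f}")
```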
Liquidity, volatility, and flights to safety in the US treasury market: Evidence from a new class of dynamic order book models
We propose a new class of dynamic order book models that allow us to 1) study episodes of extremely low liquidity and 2) unite liquidity and volatility in one framework through which their joint dynamics can be examined. Liquidity and volatility in the U.S. Treasury securities market are analyzed around the time of economic announcements, throughout the recent financial crisis, and during flight-to-safety episodes. We document that Treasury market depth declines sharply during the crisis, accompanied by increased price volatility, but that trading activity seems unaffected until after the Lehman Brothers bankruptcy. Our models' key finding is that price volatility and depth at the best bid and ask prices exhibit a negative feedback relationship and that each becomes more persistent during the crisis. Lastly, we characterize the Treasury market during flights to safety as having much lower market depth, along with higher trading volume and greater price uncertainty.
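The paper's order book models are not reproduced here, but the negative feedback between depth and volatility it describes can be sketched with a stylized bivariate autoregression on invented parameters: higher volatility erodes depth, thinner depth raises volatility, and both series are persistent.

```python
# Stylized sketch of a negative feedback loop between market depth and price
# volatility, each with its own persistence, on made-up parameters.
# This is an illustration of the described joint dynamics, not the paper's model.
import numpy as np

rng = np.random.default_rng(3)
T = 1_000
depth = np.empty(T)
vol = np.empty(T)
depth[0], vol[0] = 100.0, 1.0

for t in range(1, T):
    # Higher volatility yesterday lowers depth today, and thinner depth
    # yesterday raises volatility today; both series are persistent.
    depth[t] = 20.0 + 0.8 * depth[t - 1] - 5.0 * vol[t - 1] + rng.normal(0, 2.0)
    vol[t] = 0.5 + 0.7 * vol[t - 1] - 0.002 * depth[t - 1] + abs(rng.normal(0, 0.1))
    vol[t] = max(vol[t], 0.01)

print(f"sample correlation(depth, volatility) = {np.corrcoef(depth, vol)[0, 1]:.2f}")
```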
Modeling Quantum Optical Components, Pulses and Fiber Channels Using OMNeT++
Quantum Key Distribution (QKD) is an innovative technology which exploits the laws of quantum mechanics to generate and distribute unconditionally secure cryptographic keys. While QKD offers the promise of unconditionally secure key distribution, real-world systems are built from non-ideal components, which makes it necessary to model and understand the impact these non-idealities have on system performance and security. OMNeT++ has been used as a basis to develop a simulation framework to support this endeavor. This framework, referred to as "qkdX", extends OMNeT++'s module and message abstractions to efficiently model optical components, optical pulses, operating protocols and processes. This paper presents the design of this framework, including how OMNeT++'s abstractions have been utilized to model quantum optical components, optical pulses, and fiber and free-space channels. Furthermore, from our toolbox of created components, we present various notional and real QKD systems which have been studied and analyzed.
Comment: Published in: A. Förster, C. Minkenberg, G. R. Herrera, M. Kirsche (Eds.), Proc. of the 2nd OMNeT++ Community Summit, IBM Research - Zurich, Switzerland, September 3-4, 2015
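The qkdX framework itself is built on OMNeT++, but the kind of non-ideality such a framework has to capture can be sketched in a few stand-alone lines: fiber attenuation reduces the mean photon number of a weak coherent pulse, which in turn lowers the probability that a threshold detector fires. The loss figure, detector efficiency, and Poissonian detection model below are generic textbook assumptions, not values from the paper.

```python
# Stand-alone sketch of fiber-channel loss acting on a weak coherent pulse.
# Parameter values are illustrative; this is not the qkdX/OMNeT++ code.
import math

def surviving_mean_photons(mu: float, length_km: float, loss_db_per_km: float = 0.2) -> float:
    """Mean photon number of a weak coherent pulse after a lossy fiber."""
    transmittance = 10 ** (-loss_db_per_km * length_km / 10)
    return mu * transmittance

def detection_probability(mu: float, detector_efficiency: float = 0.1) -> float:
    """Probability a threshold detector fires, assuming Poissonian photon statistics."""
    return 1 - math.exp(-detector_efficiency * mu)

for km in (10, 50, 100):
    mu_out = surviving_mean_photons(mu=0.5, length_km=km)
    print(f"{km:>3} km: mean photons {mu_out:.4f}, "
          f"detection probability {detection_probability(mu_out):.4f}")
```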
Task Force on the StaffWeb: Final Report
On 30 March 1999, a StaffWeb Committee was created to make recommendations about the future development, content, and maintenance of the StaffWeb. This is the committee's final report.
Large dynamic covariance matrices: Enhancements based on intraday data
Multivariate GARCH models do not perform well in large dimensions due to the so-called curse of dimensionality. The recent DCC-NL model of Engle et al. (2019) is able to overcome this curse via nonlinear shrinkage estimation of the unconditional correlation matrix. In this paper, we show how performance can be increased further by using open/high/low/close (OHLC) price data instead of simply using daily returns. A key innovation, for the improved modeling of not only dynamic variances but also dynamic correlations, is the concept of a regularized return, obtained from a volatility proxy in conjunction with a smoothed sign of the observed return.
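The exact construction of the regularized return is defined in the paper; the sketch below shows one plausible reading of the idea, pairing a Parkinson high-low volatility proxy with an exponentially smoothed sign of the close-to-close return. The smoothing scheme, window, and synthetic OHLC data are assumptions for illustration.

```python
# Hedged sketch of a "regularized return": an OHLC-based volatility proxy
# combined with a smoothed sign of the observed return. Illustrative only;
# the paper's exact construction may differ.
import numpy as np
import pandas as pd

def regularized_returns(ohlc: pd.DataFrame, smooth_span: int = 5) -> pd.Series:
    """ohlc must have columns 'high', 'low', and 'close'."""
    # Parkinson volatility proxy from the daily high-low range.
    parkinson = np.sqrt(np.log(ohlc["high"] / ohlc["low"]) ** 2 / (4 * np.log(2)))
    # Exponentially smoothed sign of the observed close-to-close log return.
    raw_ret = np.log(ohlc["close"]).diff()
    smooth_sign = np.sign(raw_ret.ewm(span=smooth_span).mean())
    return parkinson * smooth_sign

# Tiny synthetic example with made-up prices.
idx = pd.date_range("2020-01-01", periods=250, freq="B")
rng = np.random.default_rng(4)
close = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, size=len(idx))))
ohlc = pd.DataFrame({
    "high": close * (1 + np.abs(rng.normal(0, 0.005, len(idx)))),
    "low": close * (1 - np.abs(rng.normal(0, 0.005, len(idx)))),
    "close": close,
}, index=idx)
print(regularized_returns(ohlc).tail())
```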