
    The Effect of Reward Provision Timing in Mobile Application Platforms: A Social Exchange Theory Perspective

    With the growing size of the food delivery mobile application market, restaurant reviews are becoming more significant. As part of their marketing strategy, restaurants listed on Korean food delivery applications such as Baemin and Yogiyo have adopted the Advance Review Reward Promotion (ARRP), in which rewards are given out before a review is written. Despite the apparent risk of handing out rewards against uncertain promises from consumers, ARRP has spread explosively among restaurants, and restaurants not offering such reward promotions have become rare. Based on the extant literature, we hypothesized that the Traditional Review Reward Promotion (TRRP), in which rewards are given out after a restaurant review is written, and ARRP differ, according to the timing of reward provision, in the quantity of reviews, the variance in the quality of verbal information in reviews, and the quantity of reviews that include visual information.

    Functional clustering methods for binary longitudinal data with temporal heterogeneity

    In the analysis of binary longitudinal data, it is of interest to model a dynamic relationship between a response and covariates as a function of time, while also investigating similar patterns of time-dependent interactions. We present a novel generalized varying-coefficient model that accounts for within-subject variability and simultaneously clusters varying-coefficient functions, without restricting the number of clusters or overfitting the data. In the analysis of a heterogeneous series of binary data, the model extracts population-level fixed effects, cluster-level varying effects, and subject-level random effects. Various simulation studies show the validity and utility of the proposed method in correctly specifying cluster-specific varying coefficients when the number of clusters is unknown. The proposed method is applied to a heterogeneous series of binary data from the German Socioeconomic Panel (GSOEP) study, where we identify three major clusters demonstrating the different varying effects of socioeconomic predictors, as a function of age, on working status.
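
    For readers unfamiliar with the model class, a varying-coefficient logistic model lets each covariate's coefficient be a function of time. Below is a minimal sketch using a simple polynomial basis for the coefficient functions; the basis choice and the function `varying_coef_logit` are illustrative assumptions, not the authors' estimator (which also includes cluster- and subject-level terms).

```python
import numpy as np

def varying_coef_logit(x, t, B):
    """Linear predictor of a varying-coefficient logistic model:
    logit P(y=1 | x, t) = sum_p beta_p(t) * x_p, with each beta_p(t)
    expanded in a small polynomial basis [1, t, t^2] (an assumed,
    illustrative basis).

    x: (n, p) covariates; t: (n,) observation times;
    B: (p, 3) basis coefficients defining each beta_p(t).
    """
    basis = np.stack([np.ones_like(t), t, t**2], axis=1)  # (n, 3)
    beta_t = basis @ B.T                                   # (n, p)
    eta = np.sum(beta_t * x, axis=1)
    return 1.0 / (1.0 + np.exp(-eta))                      # P(y=1)

# Example: two covariates whose effects change with (scaled) age t.
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 2))
t = np.linspace(0.0, 1.0, 5)
B = np.array([[0.5, -1.0, 0.3],    # beta_1(t)
              [0.0,  2.0, -1.5]])  # beta_2(t)
print(varying_coef_logit(x, t, B))
```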

    Off-Policy Reinforcement Learning with Loss Function Weighted by Temporal Difference Error

    Training agents via off-policy deep reinforcement learning (RL) requires a large memory, called a replay memory, that stores past experiences used for learning. These experiences are sampled, uniformly or non-uniformly, to create the batches used for training. When calculating the loss function, off-policy algorithms assume that all samples are of the same importance. In this paper, we hypothesize that training can be enhanced by assigning a different importance to each experience, based on its temporal-difference (TD) error, directly in the training objective. We propose a novel method that introduces a weighting factor for each experience when calculating the loss function at the learning stage. In addition to improving convergence speed when used with uniform sampling, the method can be combined with prioritization methods for non-uniform sampling. Combining the proposed method with prioritization methods improves sampling efficiency while increasing the performance of TD-based off-policy RL algorithms. The effectiveness of the proposed method is demonstrated by experiments in six environments of the OpenAI Gym suite. The experimental results demonstrate that the proposed method achieves a 33%-76% reduction in convergence time in three environments, and an 11% increase in returns and a 3%-10% increase in success rate in the other three environments. Comment: to be submitted to an AI conference.
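
    The weighting idea is easy to state concretely. Below is a minimal sketch of a TD-error-weighted loss, assuming a simple power-law weight with exponent `alpha` and mean-one normalization; both choices are illustrative assumptions, not the paper's actual scheme.

```python
import numpy as np

def td_weighted_loss(q_pred, q_target, alpha=0.5, eps=1e-6):
    """Mean squared TD loss where each sample is weighted by its own
    absolute TD error, so high-error transitions dominate the update.

    q_pred, q_target: arrays of shape (batch,) with predicted and
    bootstrapped target Q-values for the sampled transitions.
    alpha: how strongly the TD error shapes the weight (0 = uniform).
    """
    td_error = q_target - q_pred
    # Per-sample weights from |TD error|, normalized to mean 1 so the
    # overall loss scale stays comparable to the uniform case.
    w = (np.abs(td_error) + eps) ** alpha
    w = w / w.mean()
    return np.mean(w * td_error ** 2)

# Example: a batch of 4 transitions.
q_pred = np.array([1.0, 0.5, 2.0, 0.0])
q_target = np.array([1.2, 0.4, 3.0, 0.1])
print(td_weighted_loss(q_pred, q_target))
```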

    Impact of Cognitive Workload on Physiological Arousal and Performance in Younger and Older Drivers

    Two groups of drivers, aged 25-35 and 60-69, performed three levels of a delayed auditory recall task while driving on a simulated highway. Heart rate and skin conductance increased with each level of demand, demonstrating that these indices can correctly rank-order cognitive workload. Effects were also observed on speed and the standard deviation of lane position, but they were subtle, nonlinear, and did not effectively differentiate the demand levels. Patterns were quite consistent across age groups. These findings on the sensitivity of physiological measures replicate those from an on-road study using a similar protocol. Together, the results support the validity of using these physiological measures of workload in a simulated environment to model differences likely to be present under actual driving conditions.

    Bayesian model averaging approach in health effects studies: Sensitivity analyses using PM10 and cardiopulmonary hospital admissions in Allegheny County, Pennsylvania and simulated data

    Generalized Additive Models (GAMs) with natural cubic splines (NS) as smoothing functions have become a standard analytical tool in time series studies of the health effects of air pollution. However, standard model selection procedures ignore model uncertainty, which may lead to biased estimates, in particular of the lagged effects. We addressed this issue with a Bayesian model averaging (BMA) approach, which accounts for model uncertainty by combining information from all possible models in which GAMs and NS were used. First, we conducted a sensitivity analysis with simulation studies for Bayesian model averaging under different calibrated hyperparameters contained in the posterior model probabilities. Our results indicated the importance of selecting the optimum degree of lagging for variables based not only on maximizing the likelihood, but also on considering the possible effects of concurvity, the consistency of the degree of lagging, and biological plausibility. This was illustrated by analyses of the Allegheny County Air Pollution Study (ACAPS), where the quantity of interest was the relative risk of cardiopulmonary hospital admissions for a 20 μg/m³ increase in same-day PM10 values. Results showed that the posterior means of the relative risk and the 95% posterior probability intervals were close to each other under different choices of the prior distributions. Simulation results were consistent with these findings. It was also found that including lagged variables in the model when there is only a same-day effect may underestimate the relative risk attributed to the same-day effect.
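
    The averaging step at the heart of BMA is straightforward: weight each candidate model's estimate by its posterior model probability. The sketch below uses BIC-based approximate weights with equal prior model probabilities; these are common simplifying assumptions, not the calibrated hyperparameters studied in the paper, and all numbers in the example are made up for illustration.

```python
import numpy as np

def bma_estimate(estimates, log_likelihoods, n_params, n_obs):
    """Bayesian model averaging of a scalar quantity (e.g. a relative
    risk) across candidate models, with posterior model probabilities
    approximated from BIC and equal prior model probabilities.
    """
    estimates = np.asarray(estimates, dtype=float)
    bic = -2.0 * np.asarray(log_likelihoods) + np.asarray(n_params) * np.log(n_obs)
    # Posterior model probabilities proportional to exp(-BIC/2);
    # subtract the minimum BIC for numerical stability.
    w = np.exp(-0.5 * (bic - bic.min()))
    w /= w.sum()
    return float(np.sum(w * estimates)), w

# Example: three lag structures yielding slightly different relative risks.
rr, weights = bma_estimate(
    estimates=[1.012, 1.019, 1.008],
    log_likelihoods=[-5210.0, -5208.5, -5212.3],
    n_params=[8, 10, 12],
    n_obs=2922,
)
print(rr, weights)
```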

    Long-term sac behavior after endovascular abdominal aortic aneurysm repair with the Excluder low-permeability endoprosthesis

    Purpose: Sac regression is a surrogate marker for clinical success in endovascular aneurysm repair (EVAR) and has been shown to be device-specific. The low-porosity Excluder endograft (Excluder low-permeability endoprosthesis [ELPE]; W. L. Gore & Associates Inc, Flagstaff, Ariz), introduced in 2004, was reported in early follow-up to be associated with sac regression rates similar to other endografts, unlike the original Excluder, which suffered from sac growth secondary to fluid accumulation in the sac. The purpose of this study was to determine whether this behavior is durable in mid-term to long-term follow-up.
    Methods: Between July 2004 and December 2007, 301 patients underwent EVAR of an abdominal aortic aneurysm (AAA) with the ELPE at two institutions. Baseline sac size was measured by computed tomography (CT) scan at 1 month after repair. Follow-up beyond 1 year was with either a CT or an ultrasound scan. Changes in sac size ≥5 mm from baseline were considered significant. Endoleak history was assessed with respect to sac behavior using χ² and logistic regression analysis.
    Results: Two hundred sixteen patients (mean age 73.6 years; 76% men) had at least 1-year follow-up imaging available for analysis. Mean follow-up was 2.6 years (range, 1-5 years). The average minor-axis diameter was 52 mm at baseline. The proportion of patients with sac regression was similar throughout the study period: 58%, 66%, 60%, 59%, and 63% at 1 to 5 years, respectively. The proportion of patients with sac growth increased over time to 14.8% at 4-year follow-up. The probability of freedom from sac growth at 4 years was 82.4%. Eighty patients (37.7%) had an endoleak detected at some time during follow-up, with a 29.6% (16 of 54) residual endoleak rate at 4 years; 13 of the residual 16 endoleaks were type II. All patients with sac growth had endoleaks at some time during the study, compared with only 18% of patients with sac regression (P < .0001).
    Conclusion: Sustained sac regression after AAA exclusion with the ELPE is noted up to 5-year follow-up. Sac enlargement was observed only in the setting of a current or previous endoleak, with no cases of suspected hygroma formation noted.

    Bayesian Estimation of Hardness Ratios: Modeling and Computations

    A commonly used measure to summarize the nature of a photon spectrum is the so-called hardness ratio, which compares the numbers of counts observed in different passbands. The hardness ratio is especially useful for distinguishing between and categorizing weak sources, as a proxy for detailed spectral fitting. However, in this regime classical methods of error propagation fail, and estimates of spectral hardness become unreliable. Here we develop a rigorous statistical treatment of hardness ratios that properly treats detected photons as independent Poisson random variables and correctly deals with the non-Gaussian nature of the error propagation. The method is Bayesian in nature, and thus can be generalized to carry out a multitude of source-population-based analyses. We verify our method with simulation studies and compare it with the classical method. We apply the method to real-world examples, such as the identification of candidate quiescent low-mass X-ray binaries in globular clusters and tracking the time evolution of a flare on a low-mass star. Comment: 43 pages, 10 figures, 3 tables; submitted to ApJ.
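
    The Poisson treatment can be illustrated with a small Monte Carlo sketch: place independent gamma posteriors on the soft- and hard-band rates and propagate draws through the hardness ratio. The Jeffreys-prior choice and the (H - S)/(H + S) definition are assumptions for illustration; the paper's full method also handles background and instrumental effects.

```python
import numpy as np

def hardness_ratio_posterior(soft, hard, n_draws=100_000, seed=0):
    """Posterior draws of the hardness ratio HR = (H - S) / (H + S)
    for observed Poisson counts `soft` and `hard`, using independent
    gamma(count + 0.5, 1) posteriors (Jeffreys prior on each rate,
    unit exposure assumed).
    """
    rng = np.random.default_rng(seed)
    lam_s = rng.gamma(soft + 0.5, 1.0, n_draws)
    lam_h = rng.gamma(hard + 0.5, 1.0, n_draws)
    return (lam_h - lam_s) / (lam_h + lam_s)

# Weak source: 3 soft counts, 8 hard counts. Note the asymmetric,
# non-Gaussian interval that classical error propagation would miss.
hr = hardness_ratio_posterior(3, 8)
print(np.mean(hr), np.percentile(hr, [2.5, 97.5]))
```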

    Accounting for Calibration Uncertainties in X-ray Analysis: Effective Areas in Spectral Fitting

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty; ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here we present general statistical methods that incorporate calibration uncertainties into the spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties. Comment: 61 pages double-spaced, 8 figures, accepted for publication in ApJ.
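
    The PCA summarization step can be sketched in isolation: given a matrix of plausible calibration curves, keep the mean and a few scaled principal components, then generate new plausible curves from standard normal coefficients. Everything below, including the simulated stand-in for an effective-area library, is an illustrative assumption rather than the paper's implementation.

```python
import numpy as np

def pca_summary(curves, n_components=3):
    """Summarize sampled calibration curves (rows = samples,
    columns = energy bins) by their mean and leading principal
    components, so new plausible curves can be drawn cheaply.
    """
    mean = curves.mean(axis=0)
    centered = curves - mean
    # SVD of the centered sample matrix; rows of vt are components.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    scale = s[:n_components] / np.sqrt(curves.shape[0] - 1)
    return mean, scale[:, None] * vt[:n_components]

def draw_curve(mean, components, rng):
    """One plausible curve: mean + standard-normal mix of components."""
    z = rng.standard_normal(components.shape[0])
    return mean + z @ components

# Simulated stand-in for a library of plausible effective-area files:
# a smooth base curve with correlated (overall-scale) and bin-level noise.
rng = np.random.default_rng(1)
energy = np.linspace(0.3, 8.0, 200)
base = 400 * np.exp(-0.5 * ((energy - 1.5) / 1.2) ** 2)
samples = base * (1 + 0.05 * rng.standard_normal((50, 1))
                  + 0.03 * rng.standard_normal((50, 200)))
mean, comps = pca_summary(samples)
print(draw_curve(mean, comps, rng)[:5])
```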