
    Insurance demand experiments: Comparing crowdworking to the lab

    We analyze an insurance demand experiment conducted in two different settings: in-person at a university laboratory and online using a crowdworking platform. Subject demographics differ across the samples, but average insurance demand is similar. However, choice patterns suggest online subjects are less cognitively engaged: they have more variation in their demand and react less to changes in exogenous factors of the insurance situation. Applying data quality filters does not lead to more comparable demand patterns between the samples. Additionally, while online subjects pass comprehension questions at the same rate as in-person subjects, they show more random behavior in other questions. We find that online subjects are more likely to engage in “coarse thinking,” choosing from a reduced set of options. Our results justify caution in using crowdsourced subjects for insurance demand experiments. We outline some best practices that may help improve data quality from experiments conducted via crowdworking platforms.
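    The abstract does not spell out how “coarse thinking” is operationalized; as a rough illustration only, the Python sketch below flags the share of a subject's insurance choices that fall on a reduced "round-number" subset of the offered coverage grid. The grid values and choice data are hypothetical, not taken from the paper.

```python
import numpy as np

def coarse_thinking_share(choices, coarse_grid):
    """Share of a subject's choices that fall on a reduced 'round-number'
    subset of the offered coverage grid (hypothetical operationalization)."""
    choices = np.asarray(choices, dtype=float)
    return np.isin(choices, np.asarray(coarse_grid, dtype=float)).mean()

# Hypothetical data: coverage levels (in %) chosen by one subject across tasks,
# where the full menu runs from 0% to 100% in 10% steps.
coarse_grid = [0, 50, 100]                     # reduced set a coarse thinker might use
subject_choices = [50, 100, 50, 0, 50, 100]
print(coarse_thinking_share(subject_choices, coarse_grid))   # 1.0 -> fully coarse
```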

    Collecting Samples From Online Services: How to Use Screeners to Improve Data Quality

    Increasingly, marketing and consumer researchers rely on online data collection services. While actively-managed data collection services directly assist with the sampling process, minimally-managed data collection services, such as Amazon’s Mechanical Turk (MTurk), leave researchers solely responsible for recruiting, screening, cleaning, and evaluating responses. The research reported here proposes a 2 × 2 framework based on sampling goal and methodology for screening and evaluating the quality of online samples. By sampling goal, screeners can be categorized as selection, which involves matching the sample with the targeted population, or as accuracy, which involves ensuring that participants are appropriately attentive. By methodology, screeners can be categorized as direct, which screens individual responses, or as statistical, which provides quantitative signals of low quality. Multiple screeners for each of the four categories are compared across three MTurk samples, two actively-managed data collection samples (Qualtrics and Dynata), and a student sample. The results suggest the need for screening in every online sample, particularly for the MTurk samples, which have the fewest supplier-provided filters. Recommendations that provide greater transparency with respect to sampling practices are offered for researchers and journal reviewers.
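    As an illustration of the framework's direct-versus-statistical distinction (not the paper's own code), the sketch below applies one direct accuracy screener (an instructed attention check) and one statistical screener (flagging implausibly fast completion times). All column names and thresholds are hypothetical placeholders.

```python
import pandas as pd

def screen_sample(df, attention_col="attention_check", correct_answer="agree",
                  time_col="duration_sec", speed_z=-2.0):
    """Apply one direct and one statistical screener to survey responses.

    Direct / accuracy: drop respondents who fail an instructed attention check.
    Statistical: flag respondents whose completion time is far below the mean.
    Column names and the z-score cutoff are illustrative, not from the paper.
    """
    out = df.copy()
    out["fail_attention"] = out[attention_col] != correct_answer
    z = (out[time_col] - out[time_col].mean()) / out[time_col].std(ddof=0)
    out["speeder"] = z < speed_z
    return out[~out["fail_attention"] & ~out["speeder"]]

# Hypothetical usage
raw = pd.DataFrame({
    "attention_check": ["agree", "disagree", "agree", "agree"],
    "duration_sec":    [420, 390, 45, 510],
})
clean = screen_sample(raw)
print(len(raw), "->", len(clean))
```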

    A Brief Exposition on Brain-Computer Interface

    Brain-Computer Interface (BCI) is a technology that records brain signals and translates them into useful commands to operate a drone or a wheelchair. Drones are used in various applications, such as aerial operations where a pilot’s presence is impossible. BCI can also be used for patients suffering from brain diseases who have lost control of their bodies and are unable to move to satisfy their basic needs. By taking advantage of BCI and drone technology, algorithms for a Mind-Controlled Unmanned Aerial System can be developed. This paper deals with the classification of BCI and UAV systems, methodologies of BCI, the framework of BCI, neuro-imaging methods, BCI headset options, BCI platforms, electrode types and their placement, and the results of a feature extraction technique (FFT) that achieved 72.5% accuracy.
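    As a rough illustration of FFT-based feature extraction of the kind referenced above (not the authors' pipeline), the sketch below computes band powers from a single synthetic EEG channel. The sampling rate and band edges are conventional choices, not values reported in the paper.

```python
import numpy as np

def band_powers(signal, fs, bands):
    """Compute average spectral power in each frequency band via the FFT.

    signal : 1-D EEG samples from one channel
    fs     : sampling rate in Hz
    bands  : dict mapping band name -> (low_hz, high_hz)
    """
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: power[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Illustrative usage with synthetic data (band edges are conventional EEG bands)
fs = 256                                   # assumed sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz rhythm + noise
features = band_powers(eeg, fs, {"alpha": (8, 13), "beta": (13, 30)})
print(features)   # alpha power dominates for this synthetic signal
```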

    Using Multi-Theory Model to Explain HIV Screening Behaviors Among College-Aged People

    There is a need to examine novel approaches that explain the association between the initiation and sustenance of HIV screening behavior. This quantitative, cross-sectional study examined whether the components of the multi-theory model (MTM) of health behavior change can explain the initiation and sustenance of HIV screening behavior among college-aged people. A convenience sample of 151 consenting college-aged people between the ages of 18 and 34 in a western US state completed a self-administered 44-item instrument. Multiple regression analysis was used to assess the correlation between the constructs of the multi-theory model of health behavior change (independent variables) and the decision of college-aged people to initiate and sustain HIV screening (dependent variable) as a health behavior change. The results showed that two out of three initiation constructs explained 27.6% of the variance in the behavioral change to initiate HIV testing (adjusted R2 = 0.276, F(7, 143) = 9.173, p < 0.001), and two of three sustenance constructs explained 36.1% of the variance in the behavioral change to sustain HIV testing (adjusted R2 = 0.361, F(7, 143) = 13.130, p < 0.001). Implications for positive social change include justification of the utility of the multi-theory model of health behavior change to build evidence-based health education programs for public health agencies and interventions for primary care practitioners to address the growing incidence of HIV.
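    For readers unfamiliar with how the reported statistics arise, the sketch below fits an analogous multiple regression with statsmodels and reads off the adjusted R2 and F statistic. The data are simulated, and the predictor names follow the MTM initiation constructs for illustration; this is not the study's dataset or code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: MTM initiation constructs and an initiation score
rng = np.random.default_rng(0)
n = 151
df = pd.DataFrame({
    "participatory_dialogue": rng.normal(size=n),
    "behavioral_confidence":  rng.normal(size=n),
    "changes_physical_env":   rng.normal(size=n),
})
df["initiation"] = (0.4 * df["behavioral_confidence"]
                    + 0.3 * df["participatory_dialogue"]
                    + rng.normal(scale=0.8, size=n))

model = smf.ols("initiation ~ participatory_dialogue + behavioral_confidence"
                " + changes_physical_env", data=df).fit()
print(model.rsquared_adj)          # analogous to the reported adjusted R2
print(model.fvalue, model.f_pvalue)  # analogous to the reported F test
```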

    Approximate Bayesian techniques for inference in stochastic dynamical systems

    This thesis is concerned with approximate inference in dynamical systems, from a variational Bayesian perspective. When modelling real-world dynamical systems, stochastic differential equations appear as a natural choice, mainly because of their ability to model the noise of the system by adding a variant of some stochastic process to the deterministic dynamics. Hence, inference in such processes has drawn much attention. Here two new extended frameworks are derived and presented that are based on basis function expansions and local polynomial approximations of a recently proposed variational Bayesian algorithm. It is shown that the new extensions converge to the original variational algorithm and can be used for state estimation (smoothing). However, the main focus is on estimating the (hyper-)parameters of these systems (i.e. drift parameters and diffusion coefficients). The new methods are numerically validated on a range of different systems which vary in dimensionality and non-linearity. These are the Ornstein-Uhlenbeck process, for which the exact likelihood can be computed analytically, the univariate and highly non-linear stochastic double well, and the multivariate chaotic stochastic Lorenz '63 (3-dimensional model). The algorithms are also applied to the 40-dimensional stochastic Lorenz '96 system. In this investigation these new approaches are compared with a variety of other well-known methods, such as the ensemble Kalman filter/smoother, a hybrid Monte Carlo sampler, the dual unscented Kalman filter (for jointly estimating the system's states and model parameters) and full weak-constraint 4D-Var. An empirical analysis of their asymptotic behaviour as the observation density or the length of the time window increases is also provided.
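    The Ornstein-Uhlenbeck process is the natural benchmark here because its transition density is Gaussian, so the exact likelihood is available in closed form. The sketch below (illustrative only, not the thesis's variational algorithm) simulates an OU path with Euler-Maruyama and evaluates that exact log-likelihood at discretely sampled observations; all parameter values are arbitrary.

```python
import numpy as np

def simulate_ou(theta, sigma, x0, dt, n_steps, rng):
    """Euler-Maruyama simulation of dX = -theta * X dt + sigma dW."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

def ou_exact_loglik(x, theta, sigma, dt):
    """Exact log-likelihood of discretely observed OU data.

    X_{t+dt} | X_t ~ N(X_t * exp(-theta*dt), sigma^2/(2*theta) * (1 - exp(-2*theta*dt)))
    """
    mean = x[:-1] * np.exp(-theta * dt)
    var = sigma**2 / (2 * theta) * (1 - np.exp(-2 * theta * dt))
    resid = x[1:] - mean
    return -0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

rng = np.random.default_rng(1)
path = simulate_ou(theta=1.5, sigma=0.5, x0=0.0, dt=0.01, n_steps=5000, rng=rng)
obs = path[::10]                 # keep every 10th state as an "observation"
print(ou_exact_loglik(obs, theta=1.5, sigma=0.5, dt=0.1))
```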

    Three Essays on Accounting Disclosure and Information Environment

    The first paper investigates the long-run effects of fair value level disclosures on the information environment. SFAS 157 introduced mandatory disclosures about three-level fair values in 2008. Using panel data of firms' quarterly disclosures and quarterly summarized daily stock trades, we find that a higher fraction of fair value levels 2 and 3 to total assets reduces information asymmetry in the equity market. The results are consistent with the view that more disclosure improves the information environment. Furthermore, we investigate the boundaries of this primary effect. The effect is less pronounced for firms with a higher-quality ex-ante information environment. The higher the presence of dedicated institutional investors among the shareholders, the more pronounced the positive effect of disclosure, which confirms the usefulness of SFAS 157 disclosures to market participants. In contrast, the effect of the disclosure on the bid-ask spread attenuates with transient institutional holdings, as these investors have an advantage in analyzing the newly released, sophisticated disclosure contents. Results hold for both financial and non-financial firms and are robust to various specifications and estimation methods.

    The second paper examines the effect of fair value measurement levels according to SFAS 157 on information asymmetry among investors in the U.S. corporate bond market. We find that the bid-ask spread of bonds is positively associated with the ratio of total fair value to total assets, and its magnitude is higher for level 3 and level 2 assets. This implies that information asymmetry is more substantial for firms with more opaque financial assets. These results support the view that bondholders' non-linear payoff function makes them demand more conservative accounting practices. The result holds for both the financial and non-financial sectors and is robust to linear and log-linear specifications.

    The third paper studies the effect of financial reporting transparency on the liquidity creation function of banks. Recent theoretical models suggest that banks are secret keepers: by keeping information about firms secret, banks can provide money-like safe liquidity to depositors. This model implies that transparency harms liquidity creation. The previous empirical literature has treated the asset and the liability sides of banks' balance sheets separately. This study aims at connecting the two sides and measuring the effect of asset transparency on liquidity transformation. Using Call Reports, I find that the delayed expected loss recognition measure of opacity and the CAT FAT measure of liquidity creation are negatively associated, with the association most significant for small banks.
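    As a schematic of the kind of panel regression described in the first two papers (not the papers' actual specifications or data), the sketch below regresses a bid-ask spread proxy on fair value level ratios with firm and quarter fixed effects and firm-clustered standard errors. All variable names, coefficients, and data are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical firm-quarter panel: spread regressed on fair value level ratios
rng = np.random.default_rng(2)
n = 400
panel = pd.DataFrame({
    "firm":      rng.integers(0, 40, size=n),
    "quarter":   rng.integers(0, 10, size=n),
    "fv2_ratio": rng.uniform(0, 0.3, size=n),   # level 2 fair value / total assets
    "fv3_ratio": rng.uniform(0, 0.1, size=n),   # level 3 fair value / total assets
    "size":      rng.normal(size=n),            # control, e.g. log market cap
})
panel["spread"] = (0.5 * panel["fv3_ratio"] + 0.2 * panel["fv2_ratio"]
                   - 0.05 * panel["size"] + rng.normal(scale=0.1, size=n))

# OLS with firm and quarter fixed effects, standard errors clustered by firm
fe_model = smf.ols("spread ~ fv2_ratio + fv3_ratio + size + C(firm) + C(quarter)",
                   data=panel).fit(cov_type="cluster",
                                   cov_kwds={"groups": panel["firm"]})
print(fe_model.params[["fv2_ratio", "fv3_ratio"]])
```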

    Evaluation and optimal design of spectral sensitivities for digital color imaging

    The quality of an image captured by a color imaging system depends primarily on three factors: the sensor spectral sensitivity, the illumination, and the scene. While knowledge of the illumination is important, the sensitivity characteristics are critical to the success of imaging applications and need to be optimally designed under practical constraints. The ultimate image quality is judged subjectively by the human visual system. This dissertation addresses the evaluation and optimal design of spectral sensitivity functions for digital color imaging devices. Color imaging fundamentals and device characterization are discussed first. For the evaluation of spectral sensitivity functions, this dissertation concentrates on imaging noise characteristics. Both signal-independent and signal-dependent noise form the imaging noise model, and the noise is propagated as the signal is processed. A new colorimetric quality metric, the unified measure of goodness (UMG), which addresses color accuracy and noise performance simultaneously, is introduced and compared with other available quality metrics. Through this comparison, UMG is designated as the primary evaluation metric. For the optimal design of spectral sensitivity functions, three generic approaches (optimization through enumerative evaluation, optimization of parameterized functions, and optimization of an additional channel) are analyzed for the case where the filter fabrication process is unknown. Otherwise, a hierarchical design approach is introduced, which emphasizes the use of the primary metric but refines the initial optimization results through the application of multiple secondary metrics. Finally, the validity of UMG as a primary metric and of the hierarchical approach is experimentally tested and verified.
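    As a minimal illustration of signal-dependent plus signal-independent noise propagation through a linear processing step (not the dissertation's UMG formulation, which is not reproduced here), the sketch below propagates per-channel sensor noise variance through a 3x3 color correction matrix. The matrix, gain, and read-noise values are hypothetical.

```python
import numpy as np

def propagate_noise(M, signal, gain, read_var):
    """Propagate sensor noise through a linear color correction matrix.

    Illustrative noise model (not the dissertation's exact formulation):
      per-channel variance = gain * signal   (signal-dependent, shot-noise-like)
                           + read_var        (signal-independent)
    The output covariance after a linear transform M is  M @ Sigma @ M.T.
    """
    sigma_in = np.diag(gain * signal + read_var)   # independent channel noise
    return M @ sigma_in @ M.T

# Hypothetical 3x3 correction matrix and raw RGB signal levels
M = np.array([[ 1.6, -0.4, -0.2],
              [-0.3,  1.5, -0.2],
              [-0.1, -0.5,  1.6]])
raw_rgb = np.array([0.40, 0.35, 0.20])
cov_out = propagate_noise(M, raw_rgb, gain=0.01, read_var=1e-4)
print(np.sqrt(np.diag(cov_out)))   # output channel noise standard deviations
```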

    Utilizing different PAT tools to improve scale-up and process transfer for freeze drying
